update images to webp, update pre-commit hooks
@ -1,6 +1,7 @@
|
|||||||
personal_ws-1.1 en 0
|
personal_ws-1.1 en 0
|
||||||
AFAICT
|
AFAICT
|
||||||
ai
|
ai
|
||||||
|
AirBnB
|
||||||
anon
|
anon
|
||||||
anon's
|
anon's
|
||||||
Anthropic
|
Anthropic
|
||||||
@ -22,6 +23,7 @@ css
|
|||||||
Cyano
|
Cyano
|
||||||
Cyanogen
|
Cyanogen
|
||||||
debuffs
|
debuffs
|
||||||
|
decrypt
|
||||||
dev
|
dev
|
||||||
devs
|
devs
|
||||||
direnv
|
direnv
|
||||||
@ -68,6 +70,7 @@ nate
|
|||||||
nav
|
nav
|
||||||
Nephi
|
Nephi
|
||||||
NewPipe
|
NewPipe
|
||||||
|
nginx
|
||||||
Nim
|
Nim
|
||||||
Niri
|
Niri
|
||||||
nixos
|
nixos
|
||||||
|
|||||||
@ -38,7 +38,19 @@ else
|
|||||||
echo "⚠️ markdownlint not found, skipping markdown linting"
|
echo "⚠️ markdownlint not found, skipping markdown linting"
|
||||||
fi
|
fi
|
||||||
|
|
||||||
# --- Stage 2: Tag similarity check ---
|
# --- Stage 2: Interactive spell check ---
|
||||||
|
# Runs before tag check so typos in tags get corrected first
|
||||||
|
if [ -x "./scripts/spellcheck-interactive.sh" ]; then
|
||||||
|
if ! ./scripts/spellcheck-interactive.sh $STAGED_MD_FILES; then
|
||||||
|
echo "❌ Spell check failed."
|
||||||
|
OVERALL_RESULT=1
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
echo "⚠️ Spell check script not found or not executable, skipping spell check"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# --- Stage 3: Tag similarity check ---
|
||||||
|
# Runs after spell check so corrected tags are compared
|
||||||
if command -v python3 &> /dev/null && [ -f "./scripts/check-tags.py" ]; then
|
if command -v python3 &> /dev/null && [ -f "./scripts/check-tags.py" ]; then
|
||||||
echo "Running tag similarity check..."
|
echo "Running tag similarity check..."
|
||||||
if ! python3 ./scripts/check-tags.py $STAGED_MD_FILES; then
|
if ! python3 ./scripts/check-tags.py $STAGED_MD_FILES; then
|
||||||
@ -49,16 +61,6 @@ else
|
|||||||
echo "⚠️ Tag checker (python3 or scripts/check-tags.py) not found, skipping tag check"
|
echo "⚠️ Tag checker (python3 or scripts/check-tags.py) not found, skipping tag check"
|
||||||
fi
|
fi
|
||||||
|
|
||||||
# --- Stage 3: Interactive spell check ---
|
|
||||||
if [ -x "./scripts/spellcheck-interactive.sh" ]; then
|
|
||||||
if ! ./scripts/spellcheck-interactive.sh $STAGED_MD_FILES; then
|
|
||||||
echo "❌ Spell check failed."
|
|
||||||
OVERALL_RESULT=1
|
|
||||||
fi
|
|
||||||
else
|
|
||||||
echo "⚠️ Spell check script not found or not executable, skipping spell check"
|
|
||||||
fi
|
|
||||||
|
|
||||||
# --- Stage 4: Link validation ---
|
# --- Stage 4: Link validation ---
|
||||||
if [ -x "./scripts/check-links.sh" ]; then
|
if [ -x "./scripts/check-links.sh" ]; then
|
||||||
echo "Running link validation..."
|
echo "Running link validation..."
|
||||||
|
|||||||
@ -53,7 +53,7 @@ customHeadHTML = '''
|
|||||||
'''
|
'''
|
||||||
# customFooterHTML = '<p>foot123</p>'
|
# customFooterHTML = '<p>foot123</p>'
|
||||||
togglePreviousAndNextButtons = "true"
|
togglePreviousAndNextButtons = "true"
|
||||||
avatarUrl = "/images/fosscat_icon.png"
|
avatarUrl = "/images/fosscat_icon.webp"
|
||||||
avatarSize = "size-s"
|
avatarSize = "size-s"
|
||||||
numberPostsOnHomePage = 5
|
numberPostsOnHomePage = 5
|
||||||
numberProjectsOnHomePage = 3
|
numberProjectsOnHomePage = 3
|
||||||
|
|||||||
@ -0,0 +1,66 @@
|
|||||||
|
---
|
||||||
|
date: 2026-03-03T23:23:32-07:00
|
||||||
|
# description: ""
|
||||||
|
# image: ""
|
||||||
|
lastmod: 2026-03-04T01:31:12-07:00
|
||||||
|
showTableOfContents: false
|
||||||
|
tags: ["stoicism", "philosophy"]
|
||||||
|
title: "The Stoic Practice of Negative Visualization"
|
||||||
|
type: "post"
|
||||||
|
---
|
||||||
|
|
||||||
|
# Negative Visualization
|
||||||
|
|
||||||
|
I experimented today during my walk and meditation. I've taken interest at times in different practices and beliefs of the [Stoic (Wikipedia)](https://en.wikipedia.org/wiki/Stoicism) philosophy. I feel like Meditations by Marcus Aurelius really made the rounds a year or two ago amongst young millennial / gen z guys.
|
||||||
|
|
||||||
|
One of the standout practices to me is "futurorum malorum præmeditatio", or [negative visualization (Wikipedia)](https://en.wikipedia.org/wiki/Negative_visualization). It is believed to help with your resilience and gratitude. You imagine a bad scenario, losing something or someone dear to you. The reality of impermanence is brought forward intentionally in your mind.
|
||||||
|
|
||||||
|
While sitting on the special meditation rock, I tried my darndest to simply take in the view. I thought to myself, "man, I will really miss this place". I know it will be gone eventually. It's either bulldozed to erect a wonderful office building, or I move to a faraway land. I'm not a pessimist. I think this is just a fact of life. Eventually, we all get check-mated. Including the sitting rock.
|
||||||
|
|
||||||
|

|
||||||
|
|
||||||
|
## A Turn of Perspective
|
||||||
|
|
||||||
|
I caught myself. I wondered at all the times I, and others, use phrases like
|
||||||
|
|
||||||
|
> "I'm so excited for __ ..."
|
||||||
|
|
||||||
|
or
|
||||||
|
|
||||||
|
> "I'm going to miss __ ..."
|
||||||
|
|
||||||
|
or
|
||||||
|
|
||||||
|
> "I'm not looking forward to __ ..."
|
||||||
|
|
||||||
|
All living for the future.
|
||||||
|
|
||||||
|
Why don't I state how much I am enjoying the present moment? These are all futures that don't exist. They never will! Have I ever heard of someone pulling some future wish down to the present? The future is merely an idea in my head. Once I get to that future, it's never exactly what I thought. Half of the time, I get to the future moment and am looking to the next one, not even enjoying the one I was really banking on enjoying. What a shame!
|
||||||
|
|
||||||
|
So instead, I thought to the sitting rock that I really enjoyed its company. I was grateful for the seat.
|
||||||
|
|
||||||
|
## Love & Loss
|
||||||
|
|
||||||
|
I then had an unexpected experience.
|
||||||
|
|
||||||
|
I sort of intuitively combined the Buddhist practice of Metta with the Stoics' negative visualization.
|
||||||
|
|
||||||
|
{{<aside>}}
|
||||||
|
I imagined losing my life partner. The love of my life. What that would be like, being there as she left me here. Alone to hold things together. It was vague; the timeline wasn't very clear. Would this be from old age or some early-life health complication? I wasn't sure. But the feeling of loss got very deep. I breathed in and out, trying to stay with the feeling. I could feel tears pricking at the corners of my eyes. I persisted with the sensation. Then, I cried.
|
||||||
|
{{</aside>}}
|
||||||
|
|
||||||
|
I don't recall ever willfully imagining a loss so profound, solely in my mind.
|
||||||
|
|
||||||
|
Breathing with this sensation, I realized that sadness was not exactly what I was feeling.
|
||||||
|
|
||||||
|
The loss was coupled with love. A love as deep as the sorrow. Like the two sides of your hand: one side clenches and the other releases, vice versa.
|
||||||
|
|
||||||
|
## Gratitude
|
||||||
|
|
||||||
|
I'm not claiming to be some wild guru; I don't really know what I'm doing with mindfulness, considering my lack of time in the saddle. But this was an experience unlike any I have ever had in a "spiritual" sense.
|
||||||
|
|
||||||
|
I don't know how to describe it, but the sensation of love and loss resonated in me, realizing that they are almost one and the same.
|
||||||
|
|
||||||
|
I came home from that experience light on my feet. Giddy and grateful to get to kiss the love of my life, to see her as she is, and have another day to enjoy being with her.
|
||||||
|
|
||||||
|
I will certainly try to make the time to imagine terrible things more often :wink:
|
||||||
@ -5,7 +5,7 @@ draft: true
|
|||||||
tags:
|
tags:
|
||||||
summary:
|
summary:
|
||||||
cover:
|
cover:
|
||||||
image: "/images/img.jpg"
|
image: ""
|
||||||
# can also paste direct link from external site
|
# can also paste direct link from external site
|
||||||
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
||||||
alt: ""
|
alt: ""
|
||||||
|
|||||||
@ -5,7 +5,7 @@ draft: true
|
|||||||
tags:
|
tags:
|
||||||
summary:
|
summary:
|
||||||
cover:
|
cover:
|
||||||
image: "/images/img.jpg"
|
image: ""
|
||||||
# can also paste direct link from external site
|
# can also paste direct link from external site
|
||||||
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
||||||
alt: ""
|
alt: ""
|
||||||
|
|||||||
@ -1,8 +1,8 @@
|
|||||||
---
|
---
|
||||||
date: 2026-02-22T23:53:05-07:00
|
date: 2026-02-22T23:53:05-07:00
|
||||||
description: "Wondering what sort of algorithm my dog uses for navigation"
|
description: "Wondering what sort of algorithm my dog uses for navigation"
|
||||||
image: "/images/otto-on-nature-path-algorithm.jpg"
|
image: "/images/otto-on-nature-path-algorithm.webp"
|
||||||
lastmod: 2026-02-24T00:35:36-07:00
|
lastmod: 2026-03-04T01:31:12-07:00
|
||||||
showTableOfContents: false
|
showTableOfContents: false
|
||||||
tags: ["dogs", "optimization"]
|
tags: ["dogs", "optimization"]
|
||||||
title: "Dog Based Search Path"
|
title: "Dog Based Search Path"
|
||||||
|
|||||||
@ -1,11 +1,11 @@
|
|||||||
---
|
---
|
||||||
date: 2025-03-03T09:19:07-07:00
|
date: 2025-03-03T09:19:07-07:00
|
||||||
description: "I learned the importance of taking time away from the computer in software development"
|
description: "I learned the importance of taking time away from the computer in software development"
|
||||||
lastmod: 2026-02-23T01:08:53-07:00
|
lastmod: 2026-03-04T01:31:12-07:00
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
type: "post"
|
type: "post"
|
||||||
title: "TIL: Hammock Driven Development"
|
title: "TIL: Hammock Driven Development"
|
||||||
image: "images/hammock.jpg"
|
image: "/images/hammock.webp"
|
||||||
image_alt: "hammock with a cat"
|
image_alt: "hammock with a cat"
|
||||||
tags: ["clojure", "practices", "rich hickey", "til"]
|
tags: ["clojure", "practices", "rich hickey", "til"]
|
||||||
---
|
---
|
||||||
|
|||||||
@ -4,7 +4,7 @@ date: 2024-01-04T10:04:57-07:00
|
|||||||
description: 'How to host a mumble server on a subdomain behind nginx reverse proxy'
|
description: 'How to host a mumble server on a subdomain behind nginx reverse proxy'
|
||||||
tags: ["nginx"]
|
tags: ["nginx"]
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
image: "/images/nginx-mumble.png"
|
image: "/images/nginx-mumble.webp"
|
||||||
weight: 1
|
weight: 1
|
||||||
type: "post"
|
type: "post"
|
||||||
---
|
---
|
||||||
@ -13,7 +13,7 @@ type: "post"
|
|||||||
|
|
||||||
Well I couldn't find any actual examples of someone doing what I wanted, namely, hosting
|
Well, I couldn't find any actual examples of someone doing what I wanted, namely, hosting
|
||||||
the murmur server on a subdomain on my machine behind an nginx proxy. I only have ports 80
|
the murmur server on a subdomain on my machine behind an nginx proxy. I only have ports 80
|
||||||
and 443 opened on my router, so I chose to recieve the mumble traffic to come in on port 443.
|
and 443 opened on my router, so I chose to receive the mumble traffic to come in on port 443.
|
||||||
Sounds easy enough, but the problem comes when you let nginx decrypt the packets in the process
|
Sounds easy enough, but the problem comes when you let nginx decrypt the packets in the process
|
||||||
of passing them to the murmur server, it raises a TLS/SSL Termination Error. Murmur insists on
|
of passing them to the murmur server: it raises a TLS/SSL Termination Error. Murmur insists on
|
||||||
End to End Encryption (E2EE), which is a good thing.
|
End-to-End Encryption (E2EE), which is a good thing.
|
||||||
@ -23,7 +23,7 @@ an Ad riddled page, here is the nginx config that got my setup working, all of t
|
|||||||
on an Arch Linux install, minus the `stream` block. Ports need to be defined for your setup for
|
on an Arch Linux install, minus the `stream` block. Ports need to be defined for your setup for
|
||||||
`INTERNAL_MUMBLE_PORT` (port that murmur is listening on) and `NEW_NGINX_SSL_PORT`. Previously,
|
`INTERNAL_MUMBLE_PORT` (port that murmur is listening on) and `NEW_NGINX_SSL_PORT`. Previously,
|
||||||
`NEW_NGINX_SSL_PORT` was 443, but the stream block now will be using 443, and you can't bind to the same
|
`NEW_NGINX_SSL_PORT` was 443, but the stream block now will be using 443, and you can't bind to the same
|
||||||
port with seperate services. So pick a new port for the other ssl nginx services to listen on,
|
port with separate services. So pick a new port for the other ssl nginx services to listen on,
|
||||||
as well as pass traffic to, internally.
|
as well as pass traffic to, internally.
|
||||||
|
|
||||||
`nginx.conf`
|
`nginx.conf`
|
||||||
|
|||||||
@ -4,7 +4,7 @@ date: 2022-08-31T20:38:09-06:00
|
|||||||
tags: ['self host', 'raspberry pi']
|
tags: ['self host', 'raspberry pi']
|
||||||
description: 'I talk about how the "cloud" works, and show how one can host a site on the internet'
|
description: 'I talk about how the "cloud" works, and show how one can host a site on the internet'
|
||||||
type: "post"
|
type: "post"
|
||||||
image: "/images/ocean-aerial.jpg"
|
image: "/images/ocean-aerial.webp"
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
weight: 1
|
weight: 1
|
||||||
---
|
---
|
||||||
@ -16,7 +16,7 @@ Back in my senior year of highschool, my buddies and I thought it would be funny
|
|||||||
|
|
||||||
## Internet, I've Heard of That
|
## Internet, I've Heard of That
|
||||||
|
|
||||||
I once heard the internet described as "the cloud", which is good to help people understand you know nothing about it. To give a marginally better explanation, imagine your brain, with all its neurons interconnected and whatnot. Lets call each neuron a "node". Each node holds information, and when it recieves a message it decides what to do with that information, modify it, store it, pass it on, sell it to the highest bidder for ad revenue, the possibilities are endless. In this way, the brain is much like the internet. These "nodes", or nuerons, are actually computers that make up the internet, a big web of interconnected, communicating devices. Our goal is to add a node to the network, and tell it to send specific information to anyone who calls on it.
|
I once heard the internet described as "the cloud", which is good to help people understand you know nothing about it. To give a marginally better explanation, imagine your brain, with all its neurons interconnected and whatnot. Let's call each neuron a "node". Each node holds information, and when it receives a message it decides what to do with that information: modify it, store it, pass it on, sell it to the highest bidder for ad revenue; the possibilities are endless. In this way, the brain is much like the internet. These "nodes", or neurons, are actually computers that make up the internet, a big web of interconnected, communicating devices. Our goal is to add a node to the network, and tell it to send specific information to anyone who calls on it.
|
||||||
<!--
|
<!--
|
||||||
How did I get Here
|
How did I get Here
|
||||||
---
|
---
|
||||||
@ -37,10 +37,10 @@ If you want to put your stake on the great world wide web, then you need a few t
|
|||||||
|
|
||||||
I will be walking you through the steps I took to get you here on this web page. There are hundreds of ways to get something on the internet, and my way is certainly one of them. For reference, I am running Arch Linux btw on my main computer.
|
I will be walking you through the steps I took to get you here on this web page. There are hundreds of ways to get something on the internet, and my way is certainly one of them. For reference, I am running Arch Linux btw on my main computer.
|
||||||
|
|
||||||
Since I don't want to pay $5 a month for ease-of-use and stability and scalability, I will be using a rasperry pi zero w2 plugged into a charging brick behind my book shelf to be the node.
|
Since I don't want to pay $5 a month for ease-of-use and stability and scalability, I will be using a raspberry pi zero w2 plugged into a charging brick behind my book shelf to be the node.
|
||||||
|
|
||||||

|

|
||||||
My Rasperry Pi Zero w2 - How it is currently serving up the web
|
My Raspberry Pi Zero w2 - How it is currently serving up the web
|
||||||
|
|
||||||
## Get a Domain
|
## Get a Domain
|
||||||
|
|
||||||
@ -127,7 +127,7 @@ If you don't have a raspberry pi and instead coughed over $$$ to Jeff Bezos, the
|
|||||||
| A | www.<mydomain> | ip address |
|
| A | www.<mydomain> | ip address |
|
||||||
| A | <mydomain> | ip address |
|
| A | <mydomain> | ip address |
|
||||||
|
|
||||||
- If you are self-hosting like I am, you need to port-forward your device to make it visible on the world-wide-web. The steps to do this vary depending on your router. I fortunately and unfortunately have Google Fiber. So I can download Warzone in an hour but my resistance to an all-knowing data-collecting monolith feels futile. To port forward, you want to lock your raspberry pi's IP assigned by the router. This is done through DHCP. Then, open up your router's external port to the cooresponding internal raspberry pi port. For http you want your pi's port `80`. If you use SSL (which you should, its easy to setup), then use port `443`.
|
- If you are self-hosting like I am, you need to port-forward your device to make it visible on the world-wide-web. The steps to do this vary depending on your router. I fortunately and unfortunately have Google Fiber. So I can download Warzone in an hour but my resistance to an all-knowing data-collecting monolith feels futile. To port forward, you want to lock your raspberry pi's IP assigned by the router. This is done through DHCP. Then, open up your router's external port to the corresponding internal raspberry pi port. For http you want your pi's port `80`. If you use SSL (which you should, it's easy to set up), then use port `443`.
|
||||||
|
|
||||||
:exclamation: DISCLAIMER :exclamation:
|
:exclamation: DISCLAIMER :exclamation:
|
||||||
|
|
||||||
|
|||||||
@ -6,7 +6,7 @@ tags:
|
|||||||
summary:
|
summary:
|
||||||
tocOpen: true
|
tocOpen: true
|
||||||
cover:
|
cover:
|
||||||
image: "/images/img.jpg"
|
image: ""
|
||||||
# can also paste direct link from external site
|
# can also paste direct link from external site
|
||||||
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
||||||
alt: ""
|
alt: ""
|
||||||
|
|||||||
@ -6,7 +6,7 @@ tags:
|
|||||||
summary:
|
summary:
|
||||||
tocOpen: true
|
tocOpen: true
|
||||||
cover:
|
cover:
|
||||||
image: "/images/img.jpg"
|
image: ""
|
||||||
# can also paste direct link from external site
|
# can also paste direct link from external site
|
||||||
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
||||||
alt: ""
|
alt: ""
|
||||||
|
|||||||
@ -4,7 +4,7 @@ date: 2022-11-18T14:31:04-07:00
|
|||||||
tags: ['travel', 'japan']
|
tags: ['travel', 'japan']
|
||||||
description: 'We arrived to Tokyo, experienced some of the city as we found our way to our hotel for the night.'
|
description: 'We arrived in Tokyo, experienced some of the city as we found our way to our hotel for the night.'
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
image: "/images/japan-arrival.JPG"
|
image: "/images/japan-arrival.webp"
|
||||||
weight: 1
|
weight: 1
|
||||||
type: "post"
|
type: "post"
|
||||||
---
|
---
|
||||||
@ -15,18 +15,18 @@ My wife and I finally get to experience first hand all of the things we watched
|
|||||||
|
|
||||||
## Flight Experience
|
## Flight Experience
|
||||||
|
|
||||||
We had a short flight to Seattle early in the morning, an hour wait, then hopped on the massive 10-people-per-row, serves-you-two-meals passenger plane. I've never flown internationally to this extent, so this was all very new experiece for me. The food was surprisingly delicious, my butt was unsurprisingly sore after a 9.5 hour flight. Things passed honestly quicker than I anticipated. We arrived in Japan at 2PM the next day. Pretty trippy, but sleeping on the plane kinda felt like we didn't just lose a day to the time-zone lords. We were both pretty frazzled after getting off the plane, but there were a ton of Japanese people pointing and holding signs of where to go to get processed through the system. When I was getting my fringerprints scanned, I didn't know it was also taking a picture. My nose was itchy so I did what anyone would do with an itchy nose and occupied hands, I scrunched my face rapidly and desperately. When my fingers were done it showed me my lemon-head motion-blurred image of a fugitive. Hopefully there is no system that flags me. Inside the airport, we purchased a SIM card for one of our phones to have data, and luckily found super friendly english-speaking worker that told us how to get to our hotel. We took a short one hour bus ride from Haneda to Shinjuku where we will be staying for the next three days.
|
We had a short flight to Seattle early in the morning, an hour wait, then hopped on the massive 10-people-per-row, serves-you-two-meals passenger plane. I've never flown internationally to this extent, so this was all a very new experience for me. The food was surprisingly delicious, my butt was unsurprisingly sore after a 9.5 hour flight. Things passed honestly quicker than I anticipated. We arrived in Japan at 2PM the next day. Pretty trippy, but sleeping on the plane kinda felt like we didn't just lose a day to the time-zone lords. We were both pretty frazzled after getting off the plane, but there were a ton of Japanese people pointing and holding signs of where to go to get processed through the system. When I was getting my fingerprints scanned, I didn't know it was also taking a picture. My nose was itchy so I did what anyone would do with an itchy nose and occupied hands: I scrunched my face rapidly and desperately. When my fingers were done it showed me my lemon-head motion-blurred image of a fugitive. Hopefully there is no system that flags me. Inside the airport, we purchased a SIM card for one of our phones to have data, and luckily found a super friendly English-speaking worker that told us how to get to our hotel. We took a short one hour bus ride from Haneda to Shinjuku where we will be staying for the next three days.
|
||||||
|
|
||||||
## The City
|
## The City
|
||||||
|
|
||||||
Immediately after getting off the bus, we found ourselves in the heart of a cement and steel labrynth. I've been to New York, but otherwise the largest city I've been in is Phoenix. Shinjuku (which to my understanding is inside the "bigger" Tokyo, but I think there is a Tokyo, Tokyo, kinda like NY, NY..?) is huge, with every development towering over the people below. Carrying luggage in Shinjuku was no sweat, wide sidewalks and side streets dedicated to foot traffic makes walking around here super convenient. We didn't realize that the hotel we booked was a massive chain, like a Hilton, but with the compactness of Tokyo, there were APA hotels just across the street from each other. So it took us not a few conversations with APA receptionists to find the right one. My wife and I were told that inside Tokyo we would be fine with English, and so far that is barely true. The language barrier is very apparent, and we struggle to understand pretty frequently the english japanese people can speak. But if anything it adds to the adventure of it all.
|
Immediately after getting off the bus, we found ourselves in the heart of a cement and steel labyrinth. I've been to New York, but otherwise the largest city I've been in is Phoenix. Shinjuku (which to my understanding is inside the "bigger" Tokyo, but I think there is a Tokyo, Tokyo, kinda like NY, NY..?) is huge, with every development towering over the people below. Carrying luggage in Shinjuku was no sweat; wide sidewalks and side streets dedicated to foot traffic make walking around here super convenient. We didn't realize that the hotel we booked was a massive chain, like a Hilton, but with the compactness of Tokyo, there were APA hotels just across the street from each other. So it took us not a few conversations with APA receptionists to find the right one. My wife and I were told that inside Tokyo we would be fine with English, and so far that is barely true. The language barrier is very apparent, and we struggle pretty frequently to understand the English Japanese people can speak. But if anything it adds to the adventure of it all.
|
||||||
|
|
||||||
## Shopping
|
## Shopping
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
Of course the first thing we had to do was shop! The city is filled with conbinis (cone-bee-knees, small convience stores, 7/11 brand :open_mouth:), H&M looking shopping stores, sooo many bars, girls in super tall boots holding signs I can't read so I have no idea what they are up to, and people! Its an anxious feeling, in the excited and nervous sense, feeling so small in such a huge city.
|
Of course the first thing we had to do was shop! The city is filled with conbinis (cone-bee-knees, small convenience stores, 7/11 brand :open_mouth:), H&M looking shopping stores, sooo many bars, girls in super tall boots holding signs I can't read so I have no idea what they are up to, and people! It's an anxious feeling, in the excited and nervous sense, feeling so small in such a huge city.
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
We saw a 6 story McDonalds on our way to the hotel so we satisfied our cravings there then went to a store called UNIQLO. Everything is so clean here, people have been really kind, and they even laugh when I say "konichiwa" (I won't read into why, feels better to assume my humor is bilingual). After about an hour of shopping we headed back to our room. We hit the sac around 9 and sleep caught us faster than we hoped.
|
We saw a 6-story McDonald's on our way to the hotel so we satisfied our cravings there, then went to a store called UNIQLO. Everything is so clean here, people have been really kind, and they even laugh when I say "konichiwa" (I won't read into why, feels better to assume my humor is bilingual). After about an hour of shopping we headed back to our room. We hit the sack around 9 and sleep caught us faster than we hoped.
|
||||||
|
|||||||
@ -5,7 +5,7 @@ tags: ['japan', 'travel', 'cats']
|
|||||||
description: 'Our second day was full of cats, visiting Nippori and walking down the "shotengai", or the main shopping street.'
|
description: 'Our second day was full of cats, visiting Nippori and walking down the "shotengai", or the main shopping street.'
|
||||||
type: "post"
|
type: "post"
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
image: "/images/japan-nippori-walk.JPG"
|
image: "/images/japan-nippori-walk.webp"
|
||||||
---
|
---
|
||||||
|
|
||||||
# Japan is Fond of Their Cats
|
# Japan is Fond of Their Cats
|
||||||
@ -14,27 +14,27 @@ My wife and I have two cats of our own, so naturally we had to experience the ca
|
|||||||
|
|
||||||
## Morning - Garden Walk
|
## Morning - Garden Walk
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
We had our equivalent of a 'continental' breakfast at our hotel, which was leaps ahead of American hosting hospitality standards. Plus, it was largely fats and protiens, which is a much better way to check off consuming your 'most important meal of the day'. We then headed to Shinjuku National Garden - $5 admission, and enjoyed the h\*ck out of the 65 degree weather. We walked and viewed the grounds, the whole garden is probably a mile across. Before we left we found a little bench in the sun for my wife to paint, she brought a pocket watercolor set. Away to Niporri to meet En and cat city.
|
We had our equivalent of a 'continental' breakfast at our hotel, which was leaps ahead of American hosting hospitality standards. Plus, it was largely fats and proteins, which is a much better way to check off consuming your 'most important meal of the day'. We then headed to Shinjuku National Garden - $5 admission, and enjoyed the h\*ck out of the 65 degree weather. We walked and viewed the grounds; the whole garden is probably a mile across. Before we left we found a little bench in the sun for my wife to paint; she brought a pocket watercolor set. Away to Nippori to meet En and cat city.
|
||||||
|
|
||||||
## Afternoon - Cat City
|
## Afternoon - Cat City
|
||||||
|
|
||||||
En is from Japan, learned English in Scottland of all places (she spoke english with a Japanese accent instead of Scottish, which disappointed me just a little), and was such a kind guide. We walked down the shotengai of Nippori, the main shopping street of a city, exploring all the little shops on each side. Each store was themed it seemed, like bamboo goods, paulownia wood boxes, and of course, cat shops, selling things to show your love for cats, not to care for them. We loved it. There were also a number of local art galleries, often hosted inside historic homes in the neighborhood that had been renovated. It was really interesting to walk through these renovated-homes-converted-art-cafes and see traditional alongside contemorary art by the local residents, while people order teas and coffee at the cafe.
|
En is from Japan, learned English in Scotland of all places (she spoke English with a Japanese accent instead of Scottish, which disappointed me just a little), and was such a kind guide. We walked down the shotengai of Nippori, the main shopping street of a city, exploring all the little shops on each side. Each store was themed, it seemed, like bamboo goods, paulownia wood boxes, and of course, cat shops, selling things to show your love for cats, not to care for them. We loved it. There were also a number of local art galleries, often hosted inside historic homes in the neighborhood that had been renovated. It was really interesting to walk through these renovated-homes-converted-art-cafes and see traditional alongside contemporary art by the local residents, while people ordered teas and coffee at the cafe.
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
This house called itself the nekomachi, or cat-town of the neighborhood, with a gallery inside of exclusively cat-themed art.
|
This house called itself the nekomachi, or cat-town of the neighborhood, with a gallery inside of exclusively cat-themed art.
|
||||||
|
|
||||||
It was probbably the most 'local' thing we will do on our trip, we loved it. We ended our walk in Niporri passing through the neighborhood's 'graveyard', in the word of En, though I think the western idea of graveyard is tainted with Halloween spooky connotations. En told us that it has been full for probably 30 years, with any new bodies desiring to be buried needing to relocate further from Tokyo. Some of the gravesites had massive stones with characters carved into it's face. En told us they chronicled the person's life and their family, not unlike a westerner's tombstone, just more verbose. I should mention that we stopped and purchased ourselves some taiyaki, a fish pastry filled with sweet red bean paste, and my wife was crazy for it. They are pretty good.
|
It was probably the most 'local' thing we will do on our trip; we loved it. We ended our walk in Nippori passing through the neighborhood's 'graveyard', in the words of En, though I think the western idea of graveyard is tainted with Halloween spooky connotations. En told us that it has been full for probably 30 years, with any new bodies desiring to be buried needing to relocate further from Tokyo. Some of the gravesites had massive stones with characters carved into their faces. En told us they chronicled the person's life and their family, not unlike a westerner's tombstone, just more verbose. I should mention that we stopped and purchased ourselves some taiyaki, a fish pastry filled with sweet red bean paste, and my wife was crazy for it. They are pretty good.
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
## Evening - Cat Cafe and Ramen
|
## Evening - Cat Cafe and Ramen
|
||||||
|
|
||||||
We ended our evening visiting Akihabara. We were hoping to enjoy its renowned anime consumer content, but were disappointed to not find any stickers or shirts, just pins, figurines, charms, manga, flags, posters, chibis, funko pops, and probably every other piece of paraphernalia. I guess the Japanese just do it differently. We were pretty hungry, so we headed over to the cat cafe. You pay for time and get to use an auto-dispensing hot/cold drink machine (coffee, tea, cocoa) and pet any of the 20+ cats in the space. It had a good vibe, and the cats were soft beyond my comprehension. They also were all so aloof; they get so much attention and stimulation that they can get anything they want, so they were very uninterested in you. Felt insulting. We finished the evening by getting some genuine 'pork oil noodle' ramen at a walk-up ramen bar. The taste was pretty awesome, just had to get over the idea of them pouring pork oil on the noodles.
|
We ended our evening visiting Akihabara. We were hoping to enjoy its renowned anime consumer content, but were disappointed to not find any stickers or shirts, just pins, figurines, charms, manga, flags, posters, chibis, funko pops, and probably every other piece of paraphernalia. I guess the Japanese just do it differently. We were pretty hungry, so we headed over to the cat cafe. You pay for time and get to use an auto-dispensing hot/cold drink machine (coffee, tea, cocoa) and pet any of the 20+ cats in the space. It had a good vibe, and the cats were soft beyond my comprehension. They also were all so aloof; they get so much attention and stimulation that they can get anything they want, so they were very uninterested in you. Felt insulting. We finished the evening by getting some genuine 'pork oil noodle' ramen at a walk-up ramen bar. The taste was pretty awesome, just had to get over the idea of them pouring pork oil on the noodles.
|
||||||
|
|
||||||
We had a super fun second day. Felt less wiped at the end of the day, but our feet were aching. I was wearing my minmalist flip-flops because the weather was so nice, but that is probably going to end.
|
We had a super fun second day. Felt less wiped at the end of the day, but our feet were aching. I was wearing my minimalist flip-flops because the weather was so nice, but that is probably going to end.
|
||||||
|
|
||||||
|
|
||||||
## My Japan Travel Tips (JTT)
|
## My Japan Travel Tips (JTT)
|
||||||
|
|||||||
@ -4,7 +4,7 @@ date: 2022-11-21T05:19:38-07:00
|
|||||||
tags: ['travel', 'japan']
|
tags: ['travel', 'japan']
|
||||||
description: 'We packed up from Shinjuku and headed to a small town near the base of Mount Fuji and experienced the generosity and kindness of an old man there.'
|
description: 'We packed up from Shinjuku and headed to a small town near the base of Mount Fuji and experienced the generosity and kindness of an old man there.'
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
image: "/images/japan-fuji.JPG"
|
image: "/images/japan-fuji.webp"
|
||||||
weight: 1
|
weight: 1
|
||||||
type: "post"
|
type: "post"
|
||||||
---
|
---
|
||||||
@ -18,20 +18,20 @@ Waking up this morning we weren't in a rush, but we felt like we had no idea how
|
|||||||
My wife asked if he knew where any good shrines were in the area, anticipating some directions, and he just said something along the lines of "ah, we go". He took us straight there. He taught us the customs and etiquette of visiting the shrines:
|
My wife asked if he knew where any good shrines were in the area, anticipating some directions, and he just said something along the lines of "ah, we go". He took us straight there. He taught us the customs and etiquette of visiting the shrines:
|
||||||
- When entering through the gate (Torii, the big red curved beam spanning across red columns) you bow twice
|
- When entering through the gate (Torii, the big red curved beam spanning across red columns) you bow twice
|
||||||
- Also when entering you walk on one side of the path, not the middle. He didn't know the word, but my guess is it's reserved for either religious or royal persons.
|
- Also when entering you walk on one side of the path, not the middle. He didn't know the word, but my guess is it's reserved for either religious or royal persons.
|
||||||
- At this shrine there was a fountain with multiple spigots, he told us to wash our hands, but motioned not to rinse your mouth and spit out the water. I didn't have a problem supressing my urge to rinse out my mouth at the sight of running water, so I considered myself lucky.
|
- At this shrine there was a fountain with multiple spigots; he told us to wash our hands, but motioned not to rinse our mouths and spit out the water. I didn't have a problem suppressing my urge to rinse out my mouth at the sight of running water, so I considered myself lucky.
|
||||||
- At the back of the grounds is the shrine offering. You toss a 5-50 yen coin in it and bow twice, then clap twice, then bow one more time, during which you make a wish and "remember it in your heart".
|
- At the back of the grounds is the shrine offering. You toss a 5-50 yen coin in it and bow twice, then clap twice, then bow one more time, during which you make a wish and "remember it in your heart".
|
||||||
- As you leave, you turn back to the shrine and bow once more.
|
- As you leave, you turn back to the shrine and bow once more.
|
||||||
|
|
||||||

|

|
||||||
Our host told us that it's believed that the god lives on top. It's probably 40+ feet around at the base.
|
Our host told us that it's believed that the god lives on top. It's probably 40+ feet around at the base.
|
||||||
|
|
||||||
Not sure how much got lost in translation, but it seemed we did a good job of following his instructions. He then dropped us off by Lake Kawaguchi, which had a stunning view of Mount Fuji with the sun conveniently setting. He left us to sightsee some more as the valley was swallowed in Mt. Fuji's shadow. Wow I'm impressed by my literary devices tonight.
|
Not sure how much got lost in translation, but it seemed we did a good job of following his instructions. He then dropped us off by Lake Kawaguchi, which had a stunning view of Mount Fuji with the sun conveniently setting. He left us to sightsee some more as the valley was swallowed in Mt. Fuji's shadow. Wow I'm impressed by my literary devices tonight.
|
||||||
|
|
||||||
## Walking the Neighborhood
|
## Walking the Neighborhood
|
||||||
|
|
||||||
It got dark really quickly, and much cooler than Tokyo got in the evenings. We stopped a local walmart equivalent and purchased some gloves and ramen cups to cook once we got home. We wanted to go out for food, but Mondays are like western Sundays, and every shop we tried was closed.
|
It got dark really quickly, and much cooler than Tokyo got in the evenings. We stopped at a local Walmart equivalent and purchased some gloves and ramen cups to cook once we got home. We wanted to go out for food, but Mondays are like western Sundays, and every shop we tried was closed.
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
There was one open but it had questionable reviews and pictures on Google, so we decided we'd better stick with the MSG than risk food poisoning. A note on portions here in Japan: everything is packaged in really sensibly sized portions. It's almost as if the governing body of Japan and food distributors are colluding together to get us to not overeat. It feels really intrusive.
|
There was one open but it had questionable reviews and pictures on Google, so we decided we'd better stick with the MSG than risk food poisoning. A note on portions here in Japan: everything is packaged in really sensibly sized portions. It's almost as if the governing body of Japan and food distributors are colluding together to get us to not overeat. It feels really intrusive.
|
||||||
|
|
||||||
@ -39,6 +39,6 @@ There was one open but it has questionable reviews and pictures on google, so we
|
|||||||
|
|
||||||
Our host was greatly amused at our noodle cup dinner. He heated up some water for us and pulled out wrapped frozen pork he had prepared and told us to heat it up to put in our ramen. It definitely leveled up the quality.
|
Our host was greatly amused at our noodle cup dinner. He heated up some water for us and pulled out wrapped frozen pork he had prepared and told us to heat it up to put in our ramen. It definitely leveled up the quality.
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
His house is off-grid with solar panels, with lots of custom wood carpentry, he told us its a hobby of his. He also runs a small cafe out of the house that is all organic, he serves food he grows in his little farm. Its a pretty incredible passion project of his, all in an effort he says to reduce his CO2 emissions, as global warming has increased the size and damage of typhoons to Japan. His eletric car also has a big sticker on it about being in some sort of EV club that I suspect views him as the president. As he showed us the house, he mentioned that we could use his 'onsen' (hot spring / public bath). He shows us into a sauna looking room with a large wooden bath. We happily accepted the offer. We had a wonderfully warm evening after some trekking out in the cold.
|
His house is off-grid with solar panels and lots of custom wood carpentry; he told us it's a hobby of his. He also runs a small cafe out of the house that is all organic, serving food he grows on his little farm. It's a pretty incredible passion project of his, all in an effort, he says, to reduce his CO2 emissions, as global warming has increased the size and damage of typhoons in Japan. His electric car also has a big sticker on it about being in some sort of EV club that I suspect views him as the president. As he showed us the house, he mentioned that we could use his 'onsen' (hot spring / public bath). He showed us into a sauna-looking room with a large wooden bath. We happily accepted the offer. We had a wonderfully warm evening after some trekking out in the cold.
|
||||||
|
|||||||
@ -4,7 +4,7 @@ date: 2022-12-27T00:47:29-07:00
|
|||||||
description: 'I take a trip down memory lane, explaining how I got my first real job as a developer.'
|
description: 'I take a trip down memory lane, explaining how I got my first real job as a developer.'
|
||||||
tags: ["work", "thoughts"]
|
tags: ["work", "thoughts"]
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
image: "/images/monochrome-path.jpg"
|
image: "/images/monochrome-path.webp"
|
||||||
weight: 1
|
weight: 1
|
||||||
type: "post"
|
type: "post"
|
||||||
---
|
---
|
||||||
@ -55,7 +55,7 @@ position, the lead dev says "So we both like you, we want you on the team." This
|
|||||||
my gauntlet technical obstacle course? But obviously I was thrilled. The dating phase ended as quickly as it began, and
|
my gauntlet technical obstacle course? But obviously I was thrilled. The dating phase ended as quickly as it began, and
|
||||||
with that, I found myself married to the corporate system.
|
with that, I found myself married to the corporate system.
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
Incoming!! Analogies to life as an RPG with points assigned to various traits that may be more genetic / permanent
|
Incoming!! Analogies to life as an RPG with points assigned to various traits that may be more genetic / permanent
|
||||||
than I make it seem. Character composition is a little more than just stats, buffs / debuffs, and inventory. But only
|
than I make it seem. Character composition is a little more than just stats, buffs / debuffs, and inventory. But only
|
||||||
|
|||||||
@ -6,7 +6,7 @@ tags:
|
|||||||
summary:
|
summary:
|
||||||
tocOpen: true
|
tocOpen: true
|
||||||
cover:
|
cover:
|
||||||
image: "/images/img.jpg"
|
image: ""
|
||||||
# can also paste direct link from external site
|
# can also paste direct link from external site
|
||||||
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
# ex. https://i.ibb.co/K0HVPBd/paper-mod-profilemode.png
|
||||||
alt: ""
|
alt: ""
|
||||||
|
|||||||
@ -1,9 +1,9 @@
|
|||||||
---
|
---
|
||||||
date: 2026-02-21T19:52:06-07:00
|
date: 2026-02-21T19:52:06-07:00
|
||||||
description: "An old parable poorly applied"
|
description: "An old parable poorly applied"
|
||||||
image: "images/elephant-and-blind-men.png"
|
image: "/images/elephant-and-blind-men.webp"
|
||||||
image_alt: "Still of an elephant and blind men from the animated short video Discovering Truth by the LDS YouTube channel."
|
image_alt: "Still of an elephant and blind men from the animated short video Discovering Truth by the LDS YouTube channel."
|
||||||
lastmod: 2026-02-23T01:08:53-07:00
|
lastmod: 2026-03-04T01:31:12-07:00
|
||||||
showTableOfContents: false
|
showTableOfContents: false
|
||||||
tags: ["philosophy", "buddhism", "truth"]
|
tags: ["philosophy", "buddhism", "truth"]
|
||||||
title: "Seeing the Blind Men and the Elephant"
|
title: "Seeing the Blind Men and the Elephant"
|
||||||
@ -33,7 +33,7 @@ In the interview, Swami Sarvapriyananda mentioned an old Buddhist parable, [_the
|
|||||||
> And prate about an Elephant
|
> And prate about an Elephant
|
||||||
> Not one of them has seen!
|
> Not one of them has seen!
|
||||||
|
|
||||||
I suprisingly recognized the parable, from a much earlier time in my life, from a very different source. I could even picture the animated elephant with the westerner's stereotypical depiction of "Indians" (eastern kind) grasping at its parts.
|
I surprisingly recognized the parable, from a much earlier time in my life, from a very different source. I could even picture the animated elephant with the westerner's stereotypical depiction of "Indians" (eastern kind) grasping at its parts.
|
||||||
|
|
||||||
## The Whole Truth
|
## The Whole Truth
|
||||||
|
|
||||||
@ -46,7 +46,7 @@ The church later made a short animated video, clipping up his talk in a nice por
|
|||||||
I reflected today on the possibility that this parable of the elephant may have been misconstrued or appropriated by Uchtdorf for his, or more generally, for the church's, purposes. I wondered how exactly he had presented the parable those years ago, and if it matched its original intended purpose.
|
I reflected today on the possibility that this parable of the elephant may have been misconstrued or appropriated by Uchtdorf for his, or more generally, for the church's, purposes. I wondered how exactly he had presented the parable those years ago, and if it matched its original intended purpose.
|
||||||
|
|
||||||
|
|
||||||
I think my gut instinct was to use this as some sort of fodder. Pointing out a mistake on the part of Uchtdorf could really reaffirm my decision to step away from the mormon faith. On further reflection and researching for this post, I think it makes more sense, and is intelectually more honest, to compare how this story has been applied. As Wikipedia states, the parable "has been used to illustrate a range of truths and fallacies".
|
I think my gut instinct was to use this as some sort of fodder. Pointing out a mistake on the part of Uchtdorf could really reaffirm my decision to step away from the Mormon faith. On further reflection and researching for this post, I think it makes more sense, and is intellectually more honest, to compare how this story has been applied. As Wikipedia states, the parable "has been used to illustrate a range of truths and fallacies".
|
||||||
|
|
||||||
And isn't that how parables, allegories, and friends all work? We mold them to our time. The cultural mores shape the parts of the elephant the blind men see. For the 500 B.C.E. crowd it's a plowshare, a mortar, and a pestle, and for us modern folk, a spear, a tree, and a snake; for an elephant's tusk, leg, and tail respectively.
|
And isn't that how parables, allegories, and friends all work? We mold them to our time. The cultural mores shape the parts of the elephant the blind men see. For the 500 B.C.E. crowd it's a plowshare, a mortar, and a pestle, and for us modern folk, a spear, a tree, and a snake; for an elephant's tusk, leg, and tail respectively.
|
||||||
|
|
||||||
|
|||||||
@ -3,7 +3,7 @@ title: "The Migration to Arch"
|
|||||||
date: 2023-08-15T02:04:21-06:00
|
date: 2023-08-15T02:04:21-06:00
|
||||||
description:
|
description:
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
image: "/images/arch-logo.png"
|
image: "/images/arch-logo.webp"
|
||||||
weight: 1
|
weight: 1
|
||||||
type: "post"
|
type: "post"
|
||||||
---
|
---
|
||||||
@ -18,7 +18,7 @@ As they say, there are two wolves inside each man, one that craves stable, calm
|
|||||||
hood, fat finger `rm -rf /` and other monstrosities that I don't care to joke about because they hurt me too much.
|
hood, fat finger `rm -rf /` and other monstrosities that I don't care to joke about because they hurt me too much.
|
||||||
|
|
||||||
So, I moved off of the Raspberry Pi Zero W2, and onto a much more legitimate PC build. The blog could have run fine on the Pi, but it didn't take long for me to
|
So, I moved off of the Raspberry Pi Zero W2, and onto a much more legitimate PC build. The blog could have run fine on the Pi, but it didn't take long for me to
|
||||||
feel justified in spending some money on a faster machine. Bought it second hand from a crypto mining rig, swapped out the pentium for a respectable i7 of some
|
feel justified in spending some money on a faster machine. Bought it second hand from a crypto mining rig, swapped out the Pentium for a respectable i7 of some
|
||||||
recent generation, and I had an upcycled machine ready for some data crunching!
|
recent generation, and I had an upcycled machine ready for some data crunching!
|
||||||
|
|
||||||
## The Services
|
## The Services
|
||||||
@ -42,7 +42,7 @@ Eventually, I envision a whole bunch of services running in my 'home lab', as ho
|
|||||||
|
|
||||||
Arch Linux was not actually the first OS I put on this new (to me) machine. I had drunk from the FSF goblet and got it in my head to try out
|
Arch Linux was not actually the first OS I put on this new (to me) machine. I had drunk from the FSF goblet and got it in my head to try out
|
||||||
[Guix System, or Guix SD,](https://guix.gnu.org/) or whatever it's officially called. I recommend checking it out. It was a couple-month adventure, but I think my lisp-less mind
|
[Guix System, or Guix SD,](https://guix.gnu.org/) or whatever it's officially called. I recommend checking it out. It was a couple-month adventure, but I think my lisp-less mind
|
||||||
couldn't handle the parantheses required. It has some really cool concepts similar to NixOS, your whole system (users, installed packages, mount points, etc) is defined in one or
|
couldn't handle the parentheses required. It has some really cool concepts similar to NixOS, your whole system (users, installed packages, mount points, etc) is defined in one or
|
||||||
more files. Despite its great documentation, it is hard to understand what is going on without some good Guile / Scheme fundamentals, which I lack. So, back to what I know: Arch (btw).
|
more files. Despite its great documentation, it is hard to understand what is going on without some good Guile / Scheme fundamentals, which I lack. So, back to what I know: Arch (btw).
|
||||||
|
|
||||||
## What Did I Learn?
|
## What Did I Learn?
|
||||||
@ -61,7 +61,7 @@ But, for only ~$5.00 a month, I get access to a sftp server with 1TB capacity, w
|
|||||||
to an HDD connected to the computer. I have lots more to say about so many of these things; I spent pretty much all day configuring everything and getting it all to work, but I
|
to an HDD connected to the computer. I have lots more to say about so many of these things; I spent pretty much all day configuring everything and getting it all to work, but I
|
||||||
told myself to prioritize frequency over comprehensiveness, so I will leave it at that.
|
told myself to prioritize frequency over comprehensiveness, so I will leave it at that.
|
||||||
|
|
||||||
Oh! I almost forgot, if you want to use the gitea instance I am running as an alternative to having your code fed to AI models, please email me! Im no enterprise company with infinite
|
Oh! I almost forgot, if you want to use the gitea instance I am running as an alternative to having your code fed to AI models, please email me! I'm no enterprise company with infinite
|
||||||
backups and resources, but I'd love to share what little I have if it would be useful to you!
|
backups and resources, but I'd love to share what little I have if it would be useful to you!
|
||||||
|
|
||||||
Thanks for stopping in :)
|
Thanks for stopping in :)
|
||||||
|
|||||||
@ -1,11 +1,11 @@
|
|||||||
---
|
---
|
||||||
date: 2025-10-03T16:19:07-06:00
|
date: 2025-10-03T16:19:07-06:00
|
||||||
description: "We were made for dogs, and they us."
|
description: "We were made for dogs, and they us."
|
||||||
lastmod: 2025-10-03T16:19:07-06:00
|
lastmod: 2026-03-04T01:31:12-07:00
|
||||||
showTableOfContents: true
|
showTableOfContents: true
|
||||||
type: "post"
|
type: "post"
|
||||||
title: "TIL: We Created Dogs and Dogs Created Us"
|
title: "TIL: We Created Dogs and Dogs Created Us"
|
||||||
image: "images/otto-1.webp"
|
image: "/images/otto-1.webp"
|
||||||
image_caption: "Otto, Stalwart"
|
image_caption: "Otto, Stalwart"
|
||||||
image_alt: "Image of my sweet pup Otto, Irish Setter 7 months"
|
image_alt: "Image of my sweet pup Otto, Irish Setter 7 months"
|
||||||
tags: ["life", "dogs", "history", "til"]
|
tags: ["life", "dogs", "history", "til"]
|
||||||
|
|||||||
13
flake.nix
@ -11,6 +11,12 @@
|
|||||||
let
|
let
|
||||||
pkgs = nixpkgs.legacyPackages.${system};
|
pkgs = nixpkgs.legacyPackages.${system};
|
||||||
in
|
in
|
||||||
|
let
|
||||||
|
tagCheckPython = pkgs.python313.withPackages (ps: [
|
||||||
|
ps.spacy
|
||||||
|
ps.spacy-models.en_core_web_lg
|
||||||
|
]);
|
||||||
|
in
|
||||||
{
|
{
|
||||||
devShells.default = pkgs.mkShell {
|
devShells.default = pkgs.mkShell {
|
||||||
buildInputs = with pkgs; [
|
buildInputs = with pkgs; [
|
||||||
@ -21,7 +27,12 @@
|
|||||||
aspell
|
aspell
|
||||||
aspellDicts.en
|
aspellDicts.en
|
||||||
fzf # Interactive spell check and tag selection
|
fzf # Interactive spell check and tag selection
|
||||||
python3 # Tag similarity checker
|
tagCheckPython # Python + spaCy for semantic tag similarity checker
|
||||||
|
|
||||||
|
# Image optimization tools (used by scripts/optimize-images.sh)
|
||||||
|
perl538Packages.ImageExifTool # EXIF metadata reading/stripping
|
||||||
|
imagemagick # Resize, auto-orient, get dimensions
|
||||||
|
libwebp # cwebp for WebP conversion
|
||||||
];
|
];
|
||||||
|
|
||||||
shellHook = ''
|
shellHook = ''
|
||||||
|
|||||||
@ -1,13 +1,15 @@
|
|||||||
#!/usr/bin/env python3
|
#!/usr/bin/env python3
|
||||||
"""
|
"""
|
||||||
check-tags.py — Tag similarity checker for Hugo content
|
check-tags.py — Semantic tag similarity checker for Hugo content
|
||||||
|
|
||||||
Compares tags in staged files against all existing tags in the site.
|
Compares tags in staged files against all existing tags in the site.
|
||||||
Warns and blocks commit when a new tag looks similar to an existing one.
|
Warns and blocks commit when a new tag is semantically similar to an existing one.
|
||||||
|
|
||||||
Similarity checks (via difflib.SequenceMatcher):
|
Uses spaCy word vectors (en_core_web_lg) for cosine similarity — catches
|
||||||
- Ratio >= 0.6 (catches typos, reordered chars, partial matches)
|
conceptual matches like "parenting" ≈ "fatherhood" while ignoring unrelated
|
||||||
- One tag is a substring of the other
|
words that happen to share letters like "dogs" vs "daily".
|
||||||
|
|
||||||
|
Fallback: if spaCy is unavailable, uses conservative edit-distance checks only.
|
||||||
|
|
||||||
Skip with: SKIP_TAG_CHECK=1 git commit
|
Skip with: SKIP_TAG_CHECK=1 git commit
|
||||||
|
|
||||||
@ -17,6 +19,7 @@ Usage: check-tags.py <file1.md> [file2.md ...]
|
|||||||
import os
|
import os
|
||||||
import re
|
import re
|
||||||
import sys
|
import sys
|
||||||
|
import time
|
||||||
from difflib import SequenceMatcher
|
from difflib import SequenceMatcher
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
|
|
||||||
@ -28,7 +31,44 @@ CYAN = "\033[0;36m"
|
|||||||
BOLD = "\033[1m"
|
BOLD = "\033[1m"
|
||||||
NC = "\033[0m"
|
NC = "\033[0m"
|
||||||
|
|
||||||
SIMILARITY_THRESHOLD = 0.6 # SequenceMatcher ratio (0-1)
|
# Cosine similarity threshold for word vectors (0-1).
|
||||||
|
# 0.65 catches morphological variants (parenting/parenthood) and synonyms
|
||||||
|
# (cannabis/marijuana) while avoiding unrelated words. Tuned for short blog tags.
|
||||||
|
SEMANTIC_THRESHOLD = 0.65
|
||||||
|
|
||||||
|
# Edit-distance threshold — only used as a typo catcher alongside semantics.
|
||||||
|
# 0.85 is very conservative: catches "kubernetse" vs "kubernetes" but not
|
||||||
|
# "dogs" vs "daily" (which scores ~0.40).
|
||||||
|
TYPO_THRESHOLD = 0.85
|
||||||
|
|
||||||
|
# Substring match: shorter tag must be at least this many chars
|
||||||
|
# and cover at least this fraction of the longer tag.
|
||||||
|
SUBSTRING_MIN_LEN = 5
|
||||||
|
SUBSTRING_MIN_RATIO = 0.6
|
||||||
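Since the diff only shows the changed lines, here is a self-contained sketch of how the two non-semantic checks behave, using the thresholds from this diff. The helper names `is_typo` and `is_substring_match` are mine for illustration, not functions in the script:

```python
from difflib import SequenceMatcher

# Constants as introduced in the diff above
TYPO_THRESHOLD = 0.85
SUBSTRING_MIN_LEN = 5
SUBSTRING_MIN_RATIO = 0.6

def is_typo(a: str, b: str) -> bool:
    # Conservative edit-distance check: only near-identical strings pass
    return SequenceMatcher(None, a, b).ratio() >= TYPO_THRESHOLD

def is_substring_match(a: str, b: str) -> bool:
    # Shorter tag must be long enough and cover enough of the longer tag
    shorter, longer = sorted([a, b], key=len)
    return (
        len(shorter) >= SUBSTRING_MIN_LEN
        and shorter in longer
        and len(shorter) / len(longer) >= SUBSTRING_MIN_RATIO
    )

print(is_typo("kubernetse", "kubernetes"))        # True — obvious transposition
print(is_typo("dogs", "daily"))                   # False — low ratio
print(is_substring_match("travel", "traveling"))  # True — long enough, 6/9 coverage
print(is_substring_match("dog", "dogs"))          # False — under the minimum length
```

This is why the old single 0.6 `SIMILARITY_THRESHOLD` flagged unrelated tags: at 0.85, edit distance is demoted to a pure typo catcher and the semantic check carries the conceptual matching.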
|
|
||||||
|
# --- spaCy setup (lazy, with graceful fallback) ---
|
||||||
|
_nlp = None
|
||||||
|
_spacy_available = None
|
||||||
|
|
||||||
|
|
||||||
|
def _load_spacy():
|
||||||
|
"""Load spaCy model once. Returns (nlp, True) or (None, False)."""
|
||||||
|
global _nlp, _spacy_available
|
||||||
|
if _spacy_available is not None:
|
||||||
|
return _nlp, _spacy_available
|
||||||
|
try:
|
||||||
|
import spacy
|
||||||
|
|
||||||
|
_nlp = spacy.load("en_core_web_lg")
|
||||||
|
_spacy_available = True
|
||||||
|
except (ImportError, OSError) as e:
|
||||||
|
print(
|
||||||
|
f"{YELLOW}spaCy not available ({e}), "
|
||||||
|
f"falling back to edit-distance only{NC}"
|
||||||
|
)
|
||||||
|
_nlp = None
|
||||||
|
_spacy_available = False
|
||||||
|
return _nlp, _spacy_available
|
||||||
|
|
||||||
|
|
||||||
def extract_tags(filepath: Path, *, keep_blanks: bool = False) -> list[str]:
|
def extract_tags(filepath: Path, *, keep_blanks: bool = False) -> list[str]:
|
||||||
@ -74,26 +114,53 @@ def extract_tags(filepath: Path, *, keep_blanks: bool = False) -> list[str]:
|
|||||||
return [t for t in tags if t]
|
return [t for t in tags if t]
|
||||||
|
|
||||||
|
|
||||||
def find_similar(new_tag: str, existing_tags: set[str]) -> list[tuple[str, str]]:
|
def find_similar(
|
||||||
|
new_tag: str,
|
||||||
|
existing_tags: set[str],
|
||||||
|
existing_docs: dict | None = None,
|
||||||
|
) -> list[tuple[str, str]]:
|
||||||
"""Find existing tags similar to a new tag.
|
"""Find existing tags similar to a new tag.
|
||||||
|
|
||||||
|
Uses semantic similarity (spaCy vectors) as the primary check,
|
||||||
|
with edit-distance as a typo-catching backup.
|
||||||
|
|
||||||
|
If existing_docs is provided, it should be a dict mapping tag strings
|
||||||
|
to their pre-computed spaCy Doc objects (avoids redundant nlp() calls).
|
||||||
|
|
||||||
Returns list of (existing_tag, reason) tuples.
|
Returns list of (existing_tag, reason) tuples.
|
||||||
"""
|
"""
|
||||||
|
nlp, has_spacy = _load_spacy()
|
||||||
similar = []
|
similar = []
|
||||||
|
|
||||||
for existing in sorted(existing_tags):
|
for existing in sorted(existing_tags):
|
||||||
if existing == new_tag:
|
if existing == new_tag:
|
||||||
continue
|
continue
|
||||||
|
|
||||||
# Check substring match
|
# --- Check 1: Substring match (restricted) ---
|
||||||
if existing in new_tag or new_tag in existing:
|
shorter, longer = sorted([new_tag, existing], key=len)
|
||||||
|
if (
|
||||||
|
len(shorter) >= SUBSTRING_MIN_LEN
|
||||||
|
and shorter in longer
|
||||||
|
and len(shorter) / len(longer) >= SUBSTRING_MIN_RATIO
|
||||||
|
):
|
||||||
similar.append((existing, "substring match"))
|
similar.append((existing, "substring match"))
|
||||||
continue
|
continue
|
||||||
|
|
||||||
# Check similarity ratio
|
# --- Check 2: Semantic similarity (primary) ---
|
||||||
|
if has_spacy:
|
||||||
|
doc_new = nlp(new_tag)
|
||||||
|
doc_ex = existing_docs[existing] if existing_docs else nlp(existing)
|
||||||
|
|
||||||
|
if doc_new.has_vector and doc_ex.has_vector:
|
||||||
|
score = doc_new.similarity(doc_ex)
|
||||||
|
if score >= SEMANTIC_THRESHOLD:
|
||||||
|
similar.append((existing, f"semantic: {score:.0%}"))
|
||||||
|
continue
|
||||||
|
|
||||||
|
# --- Check 3: Typo detection via edit distance (conservative) ---
|
||||||
ratio = SequenceMatcher(None, new_tag, existing).ratio()
|
ratio = SequenceMatcher(None, new_tag, existing).ratio()
|
||||||
if ratio >= SIMILARITY_THRESHOLD:
|
if ratio >= TYPO_THRESHOLD:
|
||||||
similar.append((existing, f"similarity: {ratio:.0%}"))
|
similar.append((existing, f"typo match: {ratio:.0%}"))
|
||||||
|
|
||||||
return similar
|
return similar
|
||||||
|
|
||||||
@ -130,8 +197,13 @@ def main() -> int:
|
|||||||
print(f"{GREEN}No existing tags found, nothing to compare against.{NC}")
|
print(f"{GREEN}No existing tags found, nothing to compare against.{NC}")
|
||||||
return 0
|
return 0
|
||||||
|
|
||||||
|
# Pre-compute spaCy docs for all existing tags (avoids repeated nlp() calls)
|
||||||
|
nlp, has_spacy = _load_spacy()
|
||||||
|
existing_docs = {tag: nlp(tag) for tag in all_tags} if has_spacy else None
|
||||||
|
|
||||||
# Check staged files for similar tags
|
# Check staged files for similar tags
|
||||||
found_issues = False
|
found_issues = False
|
||||||
|
start = time.monotonic()
|
||||||
|
|
||||||
for staged_file in staged_files:
|
for staged_file in staged_files:
|
||||||
filepath = repo_root / staged_file
|
filepath = repo_root / staged_file
|
||||||
@ -162,7 +234,7 @@ def main() -> int:
|
|||||||
continue
|
continue
|
||||||
|
|
||||||
# New tag — check for similarity
|
# New tag — check for similarity
|
||||||
similar = find_similar(tag, all_tags)
|
similar = find_similar(tag, all_tags, existing_docs)
|
||||||
|
|
||||||
if similar:
|
if similar:
|
||||||
found_issues = True
|
found_issues = True
|
||||||
@ -172,15 +244,18 @@ def main() -> int:
|
|||||||
for existing, reason in similar:
|
for existing, reason in similar:
|
||||||
print(f" {CYAN}\u2192 {existing} ({reason}){NC}")
|
print(f" {CYAN}\u2192 {existing} ({reason}){NC}")
|
||||||
|
|
||||||
|
elapsed = time.monotonic() - start
|
||||||
|
|
||||||
if found_issues:
|
if found_issues:
|
||||||
print()
|
print()
|
||||||
print(f"{RED}{BOLD}Tag similarity check failed.{NC}")
|
print(f"{RED}{BOLD}Tag similarity check failed.{NC}")
|
||||||
print(f"{RED}Consider using an existing tag, or skip with:{NC}")
|
print(f"{RED}Consider using an existing tag, or skip with:{NC}")
|
||||||
print(f"{RED} SKIP_TAG_CHECK=1 git commit{NC}")
|
print(f"{RED} SKIP_TAG_CHECK=1 git commit{NC}")
|
||||||
|
print(f"{RED} ({elapsed:.1f}s){NC}")
|
||||||
print()
|
print()
|
||||||
return 1
|
return 1
|
||||||
|
|
||||||
print(f"{GREEN}Tag check passed \u2014 no similar tags found.{NC}")
|
print(f"{GREEN}Tag check passed \u2014 no similar tags found. ({elapsed:.1f}s){NC}")
|
||||||
return 0
|
return 0
|
||||||
|
|
||||||
|
|
||||||
|
|||||||
scripts/list-tags.py (new executable file, 32 lines)
@@ -0,0 +1,32 @@
#!/usr/bin/env python3
"""List all unique tags across Hugo content, sorted alphabetically."""

import re
from pathlib import Path

content_dir = Path(__file__).resolve().parent.parent / "content"
tags: set[str] = set()

for md in content_dir.rglob("*.md"):
    text = md.read_text(encoding="utf-8")
    fm = re.match(r"^---\s*\n(.*?)\n---\s*\n", text, re.DOTALL)
    if not fm:
        continue
    inline = re.search(r"^tags:\s*\[([^\]]*)\]", fm.group(1), re.MULTILINE)
    if inline and inline.group(1).strip():
        for t in inline.group(1).split(","):
            t = t.strip().strip("\"'").lower()
            if t:
                tags.add(t)
    else:
        lm = re.search(
            r"^tags:\s*\n((?:\s+-\s+.+\n?)+)", fm.group(1), re.MULTILINE
        )
        if lm:
            for t in re.findall(r"^\s+-\s+(.*)", lm.group(1), re.MULTILINE):
                t = t.strip().strip("\"'").lower()
                if t:
                    tags.add(t)

for t in sorted(tags):
    print(t)
scripts/optimize-images.sh (new executable file, 738 lines)
@@ -0,0 +1,738 @@
#!/usr/bin/env bash
# optimize-images.sh — Image auditor, metadata stripper, and WebP optimizer for fosscat.com
#
# Usage:
#   ./scripts/optimize-images.sh              # Interactive mode
#   ./scripts/optimize-images.sh --dry-run    # Show what would happen without changing anything
#   ./scripts/optimize-images.sh --yes        # Skip all confirmation prompts
#   ./scripts/optimize-images.sh --audit-only # Only run the audit phase (no changes)

set -euo pipefail

# ---------------------------------------------------------------------------
# Configuration
# ---------------------------------------------------------------------------
IMAGES_DIR="static/images"
CONTENT_DIR="content"
CONFIG_FILE="config.toml"
MAX_WIDTH=2000
MAX_HEIGHT=2000
WEBP_QUALITY=82

# ---------------------------------------------------------------------------
# CLI flags
# ---------------------------------------------------------------------------
DRY_RUN=false
AUTO_YES=false
AUDIT_ONLY=false

for arg in "$@"; do
  case "$arg" in
    --dry-run) DRY_RUN=true ;;
    --yes|-y) AUTO_YES=true ;;
    --audit-only) AUDIT_ONLY=true ;;
    --help|-h)
      echo "Usage: $0 [--dry-run] [--yes] [--audit-only]"
      echo ""
      echo "  --dry-run     Show what would happen without making changes"
      echo "  --yes, -y     Skip confirmation prompts"
      echo "  --audit-only  Only run the audit (no modifications)"
      echo "  --help, -h    Show this help"
      exit 0
      ;;
    *)
      echo "Unknown option: $arg"
      echo "Run $0 --help for usage"
      exit 1
      ;;
  esac
done

# ---------------------------------------------------------------------------
# Colors and formatting
# ---------------------------------------------------------------------------
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
CYAN='\033[0;36m'
BOLD='\033[1m'
DIM='\033[2m'
NC='\033[0m' # No Color

info()    { echo -e "${BLUE}[INFO]${NC} $*"; }
success() { echo -e "${GREEN}[OK]${NC} $*"; }
warn()    { echo -e "${YELLOW}[WARN]${NC} $*"; }
error()   { echo -e "${RED}[ERROR]${NC} $*"; }
header()  { echo -e "\n${BOLD}${CYAN}═══ $* ═══${NC}\n"; }

# ---------------------------------------------------------------------------
# Dependency checks
# ---------------------------------------------------------------------------
check_deps() {
  local missing=()
  # The script calls magick (ImageMagick 7), identify, exiftool, and cwebp
  for cmd in exiftool magick identify cwebp; do
    if ! command -v "$cmd" &>/dev/null; then
      missing+=("$cmd")
    fi
  done

  if [[ ${#missing[@]} -gt 0 ]]; then
    error "Missing required tools: ${missing[*]}"
    echo "  These are provided by the Nix dev shell. Run:"
    echo "    nix develop   # or let direnv load the flake"
    echo ""
    echo "  Required nix packages:"
    echo "    perl538Packages.ImageExifTool (exiftool)"
    echo "    imagemagick (magick, identify)"
    echo "    libwebp (cwebp)"
    exit 1
  fi
}

# ---------------------------------------------------------------------------
# Ensure we're in the project root
# ---------------------------------------------------------------------------
if [[ ! -f "$CONFIG_FILE" ]] || [[ ! -d "$IMAGES_DIR" ]]; then
  error "Must be run from the project root (where $CONFIG_FILE and $IMAGES_DIR exist)"
  exit 1
fi

check_deps

# ---------------------------------------------------------------------------
# Utility: human-readable file size
# ---------------------------------------------------------------------------
human_size() {
  local bytes=$1
  if (( bytes >= 1048576 )); then
    local mb_whole=$(( bytes / 1048576 ))
    local mb_frac=$(( (bytes % 1048576) * 10 / 1048576 ))
    echo "${mb_whole}.${mb_frac} MB"
  elif (( bytes >= 1024 )); then
    echo "$(( bytes / 1024 )) KB"
  else
    echo "${bytes} B"
  fi
}

# ---------------------------------------------------------------------------
# Utility: confirm prompt (respects --yes and --dry-run)
# ---------------------------------------------------------------------------
confirm() {
  local prompt="$1"
  if $AUTO_YES; then
    return 0
  fi
  if $DRY_RUN; then
    echo -e "  ${DIM}(dry-run: would ask) $prompt${NC}"
    return 0
  fi
  echo -en "  $prompt ${BOLD}[y/N]${NC} "
  read -r answer
  [[ "$answer" =~ ^[Yy]$ ]]
}

# ---------------------------------------------------------------------------
# PHASE 1: AUDIT
# ---------------------------------------------------------------------------
phase_audit() {
  header "PHASE 1: IMAGE AUDIT"

  # Collect all image files
  local -a image_files=()
  while IFS= read -r -d '' f; do
    image_files+=("$f")
  done < <(find "$IMAGES_DIR" -maxdepth 1 -type f \( -iname '*.jpg' -o -iname '*.jpeg' -o -iname '*.png' -o -iname '*.webp' -o -iname '*.gif' \) -print0 | sort -z)

  if [[ ${#image_files[@]} -eq 0 ]]; then
    warn "No images found in $IMAGES_DIR"
    return
  fi

  # --- Image inventory table ---
  echo -e "${BOLD}Image Inventory${NC}"
  printf "  %-40s %-6s %-12s %s\n" "FILENAME" "FORMAT" "DIMENSIONS" "SIZE"
  printf "  %-40s %-6s %-12s %s\n" "--------" "------" "----------" "----"

  local total_size=0
  for img in "${image_files[@]}"; do
    local fname
    fname=$(basename "$img")
    local ext="${fname##*.}"
    local fsize
    fsize=$(stat -c%s "$img" 2>/dev/null || stat -f%z "$img" 2>/dev/null)
    total_size=$((total_size + fsize))
    local dims
    dims=$(identify -format "%wx%h" "$img" 2>/dev/null || echo "unknown")
    printf "  %-40s %-6s %-12s %s\n" "$fname" "$ext" "$dims" "$(human_size "$fsize")"
  done

  echo ""
  info "Total: ${#image_files[@]} images, $(human_size $total_size)"

  # --- EXIF / Metadata scan ---
  echo ""
  echo -e "${BOLD}Metadata / Privacy Scan${NC}"

  local privacy_issues=0
  # Sensitive tag names to check (extracted in a single exiftool call per image)
  local sensitive_tag_args=(
    -GPSLatitude -GPSLongitude -GPSPosition
    -SerialNumber -CameraSerialNumber -BodySerialNumber -LensSerialNumber
    -OwnerName -Artist -Copyright -Creator -Rights
    -By-line -Contact
    -Make -Model -LensModel -Software
    -DateTime -DateTimeOriginal -CreateDate
    -CreatorTool -ImageDescription -UserComment
  )

  for img in "${image_files[@]}"; do
    local fname
    fname=$(basename "$img")
    local has_metadata=false
    local metadata_lines=()

    # Single exiftool call to extract all sensitive tags at once
    local exif_output
    exif_output=$(exiftool -s -f "${sensitive_tag_args[@]}" "$img" 2>/dev/null || true)

    while IFS= read -r line; do
      [[ -z "$line" ]] && continue
      # exiftool -s output format: "TagName : value"
      local tagname value
      tagname=$(echo "$line" | sed 's/\s*:.*//' | xargs)
      value=$(echo "$line" | sed 's/^[^:]*:\s*//')

      # Skip tags with no value (exiftool -f shows "-" for missing tags)
      [[ "$value" == "-" ]] && continue
      [[ -z "$value" ]] && continue

      has_metadata=true
      # Highlight GPS data in red
      if [[ "$tagname" == *GPS* ]] || [[ "$tagname" == *Latitude* ]] || [[ "$tagname" == *Longitude* ]]; then
        metadata_lines+=("${RED}!!${NC} $tagname: $value")
      elif [[ "$tagname" == *Serial* ]] || [[ "$tagname" == *Owner* ]] || [[ "$tagname" == *Artist* ]] || [[ "$tagname" == *Creator* ]]; then
        metadata_lines+=("${YELLOW}!${NC} $tagname: $value")
      else
        metadata_lines+=("${DIM}-${NC} $tagname: $value")
      fi
    done <<< "$exif_output"

    if $has_metadata; then
      privacy_issues=$((privacy_issues + 1))
      echo -e "  ${YELLOW}$fname${NC} — metadata found:"
      for line in "${metadata_lines[@]}"; do
        echo -e "    $line"
      done
    else
      echo -e "  ${GREEN}$fname${NC} — clean"
    fi
  done

  echo ""
  if [[ $privacy_issues -gt 0 ]]; then
    warn "$privacy_issues image(s) contain metadata that should be stripped"
  else
    success "All images are clean of sensitive metadata"
  fi

  # --- Cross-reference with content ---
  echo ""
  echo -e "${BOLD}Content Reference Check${NC}"

  # Collect all image references from content files
  local -a referenced_images=()
  local -a broken_refs=()
  local -a inconsistent_paths=()

  while IFS= read -r -d '' mdfile; do
    # Front matter image field (handles both `image: "..."` and `  image: "..."` under cover:)
    while IFS= read -r fm_image; do
      [[ -z "$fm_image" ]] && continue
      # Clean up: remove surrounding quotes and whitespace
      fm_image=$(echo "$fm_image" | sed 's/^[[:space:]]*image:[[:space:]]*//' | sed 's/^["'\'']//' | sed 's/["'\'']\s*$//')

      if [[ -n "$fm_image" ]] && [[ "$fm_image" != '""' ]] && [[ "$fm_image" != http* ]]; then
        # Normalize: Hugo serves /images/... from static/images/...
        local fs_path="static/${fm_image#/}"

        # Check if it's a broken reference
        if [[ ! -f "$fs_path" ]]; then
          broken_refs+=("$mdfile|$fm_image")
        else
          referenced_images+=("$fs_path")
        fi

        # Check for inconsistent path (missing leading /)
        if [[ "$fm_image" != /* ]]; then
          inconsistent_paths+=("$mdfile|$fm_image")
        fi
      fi
    done < <(grep -E '^\s*image:\s' "$mdfile" 2>/dev/null || true)

    # Inline markdown images: ![alt](/images/foo.jpg)
    while IFS= read -r inline_ref; do
      [[ -z "$inline_ref" ]] && continue
      # Strip #fragment
      local clean_ref="${inline_ref%%#*}"
      local fs_ref="static/${clean_ref#/}"

      if [[ ! -f "$fs_ref" ]] && [[ "$clean_ref" != http* ]]; then
        broken_refs+=("$mdfile|$inline_ref")
      else
        referenced_images+=("$fs_ref")
      fi
    done < <(grep -oP '!\[[^\]]*\]\(\K[^)]+' "$mdfile" 2>/dev/null || true)

  done < <(find "$CONTENT_DIR" -name '*.md' -print0)

  # Also check config.toml for avatarUrl
  local avatar_path
  avatar_path=$(grep 'avatarUrl' "$CONFIG_FILE" | sed 's/.*=\s*["'\'']\(.*\)["'\'']/\1/' || true)
  if [[ -n "$avatar_path" ]]; then
    referenced_images+=("static/${avatar_path#/}")
  fi

  # Find unreferenced images (compare using static/images/... paths)
  local -a unreferenced=()
  for img in "${image_files[@]}"; do
    local found=false
    for ref in "${referenced_images[@]}"; do
      if [[ "$ref" == "$img" ]]; then
        found=true
        break
      fi
    done
    if ! $found; then
      unreferenced+=("$img")
    fi
  done

  # Report broken references
  if [[ ${#broken_refs[@]} -gt 0 ]]; then
    warn "${#broken_refs[@]} broken image reference(s):"
    for entry in "${broken_refs[@]}"; do
      local file="${entry%%|*}"
      local ref="${entry##*|}"
      echo -e "  ${RED}$ref${NC} in ${DIM}$file${NC}"
    done
  else
    success "No broken image references"
  fi

  # Report unreferenced images
  echo ""
  if [[ ${#unreferenced[@]} -gt 0 ]]; then
    warn "${#unreferenced[@]} unreferenced image(s) (not used in any content):"
    for img in "${unreferenced[@]}"; do
      local fsize
      fsize=$(stat -c%s "$img" 2>/dev/null || stat -f%z "$img" 2>/dev/null)
      echo -e "  ${YELLOW}$(basename "$img")${NC} ($(human_size "$fsize"))"
    done
  else
    success "All images are referenced in content"
  fi

  # Report inconsistent paths
  if [[ ${#inconsistent_paths[@]} -gt 0 ]]; then
    echo ""
    warn "${#inconsistent_paths[@]} image path(s) missing leading '/':"
    for entry in "${inconsistent_paths[@]}"; do
      local file="${entry%%|*}"
      local ref="${entry##*|}"
      echo -e "  ${YELLOW}$ref${NC} in ${DIM}$file${NC}"
    done
  fi

  # Export arrays for later phases (bash 4+ trick: print to temp files)
  printf '%s\n' "${image_files[@]}" > /tmp/optimg_files.txt
  printf '%s\n' "${unreferenced[@]+"${unreferenced[@]}"}" > /tmp/optimg_unreferenced.txt
  printf '%s\n' "${broken_refs[@]+"${broken_refs[@]}"}" > /tmp/optimg_broken.txt
  echo "$total_size" > /tmp/optimg_total_size.txt
}

# ---------------------------------------------------------------------------
# PHASE 2: METADATA STRIPPING
# ---------------------------------------------------------------------------
phase_strip_metadata() {
  header "PHASE 2: METADATA STRIPPING"

  if $DRY_RUN; then
    info "(dry-run) Would strip all EXIF/IPTC/XMP metadata from images"
    echo ""
    return
  fi

  local -a image_files=()
  mapfile -t image_files < /tmp/optimg_files.txt

  local stripped=0
  for img in "${image_files[@]}"; do
    [[ -z "$img" ]] && continue
    local fname
    fname=$(basename "$img")

    # Check if image has strippable EXIF/XMP/IPTC metadata (not just file properties)
    # Use -EXIF:All -XMP:All -IPTC:All to only check real metadata groups
    local meta_check
    meta_check=$(exiftool -s -s -s -EXIF:All -XMP:All -IPTC:All "$img" 2>/dev/null || true)

    if [[ -z "$meta_check" ]]; then
      echo -e "  ${DIM}$fname — already clean, skipping${NC}"
      continue
    fi

    # Auto-orient JPEG/PNG before stripping (applies EXIF rotation to pixels)
    local ext="${fname##*.}"
    ext=$(echo "$ext" | tr '[:upper:]' '[:lower:]')
    if [[ "$ext" == "jpg" ]] || [[ "$ext" == "jpeg" ]] || [[ "$ext" == "png" ]]; then
      magick "$img" -auto-orient "$img" 2>/dev/null || true
    fi

    # Strip all metadata
    exiftool -all= -overwrite_original "$img" 2>/dev/null
    stripped=$((stripped + 1))
    echo -e "  ${GREEN}$fname${NC} — metadata stripped"
  done

  echo ""
  success "Stripped metadata from $stripped image(s)"
}

# ---------------------------------------------------------------------------
# PHASE 3: CONVERT & COMPRESS
# ---------------------------------------------------------------------------
phase_convert() {
  header "PHASE 3: CONVERT TO WEBP & COMPRESS"

  local -a image_files=()
  mapfile -t image_files < /tmp/optimg_files.txt

  # Delete unreferenced images first
  local -a unreferenced=()
  mapfile -t unreferenced < /tmp/optimg_unreferenced.txt

  if [[ ${#unreferenced[@]} -gt 0 ]] && [[ -n "${unreferenced[0]}" ]]; then
    echo -e "${BOLD}Removing unreferenced images${NC}"
    for img in "${unreferenced[@]}"; do
      [[ -z "$img" ]] && continue
      local fsize
      fsize=$(stat -c%s "$img" 2>/dev/null || stat -f%z "$img" 2>/dev/null)
      if $DRY_RUN; then
        echo -e "  ${DIM}(dry-run) Would delete: $(basename "$img") ($(human_size "$fsize"))${NC}"
      else
        rm -f "$img"
        echo -e "  ${RED}Deleted:${NC} $(basename "$img") ($(human_size "$fsize"))"
      fi
    done
    echo ""
  fi

  echo -e "${BOLD}Converting images to WebP (quality $WEBP_QUALITY, max ${MAX_WIDTH}x${MAX_HEIGHT})${NC}"
  printf "  %-40s %-12s %-12s %s\n" "FILENAME" "BEFORE" "AFTER" "SAVINGS"
  printf "  %-40s %-12s %-12s %s\n" "--------" "------" "-----" "-------"

  local total_before=0
  local total_after=0
  local converted=0

  for img in "${image_files[@]}"; do
    [[ -z "$img" ]] && continue
    # Skip if this was an unreferenced file we just deleted
    [[ ! -f "$img" ]] && continue

    local fname
    fname=$(basename "$img")
    local ext="${fname##*.}"
    local base="${fname%.*}"
    local ext_lower
    ext_lower=$(echo "$ext" | tr '[:upper:]' '[:lower:]')
    local webp_path="$IMAGES_DIR/${base}.webp"

    local before_size
    before_size=$(stat -c%s "$img" 2>/dev/null || stat -f%z "$img" 2>/dev/null)
    total_before=$((total_before + before_size))

    if $DRY_RUN; then
      echo -e "  ${DIM}(dry-run) Would convert: $fname -> ${base}.webp${NC}"
      # Estimate: assume 80% reduction for JPEGs, 70% for PNGs, 10% for existing WebP
      local est_after=$before_size
      case "$ext_lower" in
        jpg|jpeg) est_after=$((before_size / 5)) ;;
        png) est_after=$((before_size / 3)) ;;
        webp) est_after=$((before_size * 9 / 10)) ;;
      esac
      total_after=$((total_after + est_after))
      converted=$((converted + 1))
      continue
    fi

    # Get current dimensions
    local cur_width cur_height
    read -r cur_width cur_height < <(identify -format "%w %h\n" "$img" 2>/dev/null || echo "0 0")

    local needs_resize=false
    if (( cur_width > MAX_WIDTH )) || (( cur_height > MAX_HEIGHT )); then
      needs_resize=true
    fi

    # Determine the input for cwebp
    local cwebp_input="$img"
    local tmp_resized=""

    if $needs_resize; then
      # Resize via ImageMagick, output to temp PNG for cwebp
      tmp_resized=$(mktemp /tmp/optimg_XXXXXX.png)
      magick "$img" -resize "${MAX_WIDTH}x${MAX_HEIGHT}>" -quality 100 "$tmp_resized"
      info "  Resized $fname: ${cur_width}x${cur_height} -> $(magick identify -format '%wx%h' "$tmp_resized")"
      cwebp_input="$tmp_resized"
    fi

    # Convert to WebP via cwebp (handles JPEG/PNG/WebP input natively)
    if [[ "$ext_lower" == "webp" ]] && [[ "$img" == "$webp_path" ]]; then
      # Same input and output: use a temp output file
      local tmp_webp
      tmp_webp=$(mktemp /tmp/optimg_XXXXXX.webp)
      cwebp -q "$WEBP_QUALITY" "$cwebp_input" -o "$tmp_webp" 2>/dev/null
      mv "$tmp_webp" "$webp_path"
    else
      cwebp -q "$WEBP_QUALITY" "$cwebp_input" -o "$webp_path" 2>/dev/null
    fi

    # Cleanup temp file if we resized
    [[ -n "$tmp_resized" ]] && rm -f "$tmp_resized"

    # Delete the original if it's not already .webp
    if [[ "$ext_lower" != "webp" ]]; then
      rm -f "$img"
    fi

    local after_size
    after_size=$(stat -c%s "$webp_path" 2>/dev/null || stat -f%z "$webp_path" 2>/dev/null)
    total_after=$((total_after + after_size))

    local savings=0
    if (( before_size > 0 )); then
      savings=$(( (before_size - after_size) * 100 / before_size ))
    fi

    local savings_color="$GREEN"
    if (( savings < 10 )); then
      savings_color="$YELLOW"
    fi

    printf "  %-40s %-12s %-12s ${savings_color}%s%%${NC}\n" \
      "${base}.webp" "$(human_size "$before_size")" "$(human_size "$after_size")" "$savings"

    converted=$((converted + 1))
  done

  echo ""
  local total_savings=0
  if (( total_before > 0 )); then
    total_savings=$(( (total_before - total_after) * 100 / total_before ))
  fi
  info "Converted $converted image(s)"
  info "Total: $(human_size $total_before) -> $(human_size $total_after) (${total_savings}% reduction)"

  # Save totals for summary
  echo "$total_before" > /tmp/optimg_total_before.txt
  echo "$total_after" > /tmp/optimg_total_after.txt
  echo "$converted" > /tmp/optimg_converted.txt
}

# ---------------------------------------------------------------------------
# PHASE 4: UPDATE CONTENT REFERENCES
# ---------------------------------------------------------------------------
phase_update_refs() {
  header "PHASE 4: UPDATE CONTENT REFERENCES"

  local updated_files=0

  # --- Step 1: Update image extensions in content files ---
  # This must happen BEFORE broken ref clearing, since .jpg/.png files are now .webp
  echo -e "${BOLD}Updating image references (.jpg/.jpeg/.png -> .webp)${NC}"

  while IFS= read -r -d '' mdfile; do
    local changed=false

    # Normalize front matter paths first: change image: "images/... to image: "/images/...
    if grep -qE '^\s*image:\s*"images/' "$mdfile" 2>/dev/null; then
      if ! $DRY_RUN; then
        sed -i -E 's@^(\s*image:\s*)"images/@\1"/images/@' "$mdfile"
      fi
      changed=true
    fi

    # Update front matter image field (only local paths, not http URLs)
    # Handles both `image: "/images/..."` and `  image: "/images/..."` (indented under cover:)
    if grep -qE '^\s*image:\s*"/images/.*\.(jpg|jpeg|JPG|JPEG|png|PNG)"' "$mdfile" 2>/dev/null; then
      if ! $DRY_RUN; then
        sed -i -E 's@^(\s*image:\s*"/images/[^"]*)\.(jpg|jpeg|JPG|JPEG|png|PNG)"@\1.webp"@' "$mdfile"
      fi
      changed=true
    fi

    # Update inline markdown images: ![alt](/images/foo.jpg)
    # Only match local /images/ paths, not external URLs
    if grep -qP '!\[[^\]]*\]\(/images/[^)]*\.(jpg|jpeg|JPG|JPEG|png|PNG)(#[^)]*)?\)' "$mdfile" 2>/dev/null; then
      if ! $DRY_RUN; then
        sed -i -E 's@(!\[[^]]*\]\(/images/[^.)]*)\.(jpg|jpeg|JPG|JPEG|png|PNG)([#][^)]*)?(\))@\1.webp\3\4@g' "$mdfile"
      fi
      changed=true
    fi

    if $changed; then
      local relpath="${mdfile}"
      if $DRY_RUN; then
        echo -e "  ${DIM}(dry-run) Would update refs in: $relpath${NC}"
      else
        echo -e "  ${GREEN}Updated${NC} $relpath"
      fi
      updated_files=$((updated_files + 1))
    fi
  done < <(find "$CONTENT_DIR" -name '*.md' -print0)

  # --- Step 2: Update config.toml avatar ---
  if grep -q 'avatarUrl.*\.png' "$CONFIG_FILE" 2>/dev/null; then
    if $DRY_RUN; then
      echo -e "  ${DIM}(dry-run) Would update avatarUrl in $CONFIG_FILE${NC}"
    else
      sed -i 's@avatarUrl = "/images/fosscat_icon\.png"@avatarUrl = "/images/fosscat_icon.webp"@' "$CONFIG_FILE"
      echo -e "  ${GREEN}Updated${NC} avatarUrl in $CONFIG_FILE"
    fi
    updated_files=$((updated_files + 1))
  fi

  # --- Step 3: Clear genuinely broken image references ---
  # Only clear refs that still don't resolve after extension updates
  # (e.g., placeholder /images/img.jpg that was never a real image)
  echo ""
  echo -e "${BOLD}Checking for remaining broken image references${NC}"

  local cleared=0
  while IFS= read -r -d '' mdfile; do
    # Check front matter image fields
    while IFS= read -r fm_line; do
      [[ -z "$fm_line" ]] && continue
      local fm_image
      fm_image=$(echo "$fm_line" | sed 's/^[[:space:]]*image:[[:space:]]*//' | sed 's/^["'\'']//' | sed 's/["'\'']\s*$//')

      [[ -z "$fm_image" ]] && continue
      [[ "$fm_image" == '""' ]] && continue
      [[ "$fm_image" == http* ]] && continue

      local fs_path="static/${fm_image#/}"
      if [[ ! -f "$fs_path" ]]; then
        if $DRY_RUN; then
          echo -e "  ${DIM}(dry-run) Would clear broken ref in: $mdfile (was: $fm_image)${NC}"
        else
          local escaped_image
          escaped_image=$(echo "$fm_image" | sed 's/[.[\/*^$]/\\&/g')
          sed -i -E "s@^(\s*image:\s*).*${escaped_image}.*@\1\"\"@" "$mdfile"
          echo -e "  ${GREEN}Cleared${NC} broken ref ${DIM}$fm_image${NC} in ${DIM}$mdfile${NC}"
          cleared=$((cleared + 1))
        fi
      fi
    done < <(grep -E '^\s*image:\s' "$mdfile" 2>/dev/null || true)
  done < <(find "$CONTENT_DIR" -name '*.md' -print0)

  if [[ $cleared -eq 0 ]] && ! $DRY_RUN; then
    success "No broken image references remaining"
  fi

  echo ""
  info "Updated $updated_files file(s)"
}

# ---------------------------------------------------------------------------
# PHASE 5: SUMMARY
# ---------------------------------------------------------------------------
phase_summary() {
  header "PHASE 5: SUMMARY"

  if $DRY_RUN; then
    echo -e "${BOLD}${YELLOW}DRY RUN — no changes were made${NC}"
    echo ""
  fi

  local total_before total_after converted
  total_before=$(cat /tmp/optimg_total_before.txt 2>/dev/null || cat /tmp/optimg_total_size.txt 2>/dev/null || echo 0)
  total_after=$(cat /tmp/optimg_total_after.txt 2>/dev/null || echo 0)
  converted=$(cat /tmp/optimg_converted.txt 2>/dev/null || echo 0)

  local savings=0
  if (( total_before > 0 )) && (( total_after > 0 )); then
    savings=$(( (total_before - total_after) * 100 / total_before ))
  fi

  echo -e "  Images processed: ${BOLD}$converted${NC}"
  if (( total_after > 0 )); then
    echo -e "  Size before:      ${BOLD}$(human_size "$total_before")${NC}"
    echo -e "  Size after:       ${BOLD}$(human_size "$total_after")${NC}"
    echo -e "  Total reduction:  ${BOLD}${GREEN}${savings}%${NC}"
  fi

  echo ""
  echo -e "  ${BOLD}Next steps:${NC}"
  echo -e "    1. Run ${CYAN}hugo server${NC} and verify images look correct"
  echo -e "    2. Check the browser dev tools Network tab for proper WebP delivery"
  echo -e "    3. Commit when satisfied: ${CYAN}git add -A && git commit -m \"optimize: convert images to webp, strip metadata\"${NC}"

  # Cleanup temp files
  rm -f /tmp/optimg_*.txt
}

# ---------------------------------------------------------------------------
# MAIN
# ---------------------------------------------------------------------------
main() {
  echo -e "${BOLD}${CYAN}"
  echo "  ┌─────────────────────────────────────────┐"
  echo "  │       fosscat.com Image Optimizer       │"
  echo "  │    Strip metadata · Convert to WebP     │"
  echo "  │       Resize · Audit references         │"
  echo "  └─────────────────────────────────────────┘"
  echo -e "${NC}"

  if $DRY_RUN; then
    echo -e "  ${YELLOW}Running in DRY RUN mode — no files will be modified${NC}"
    echo ""
  fi

  # Phase 1: Audit (always runs)
  phase_audit

  if $AUDIT_ONLY; then
    echo ""
    info "Audit complete. Run without --audit-only to process images."
    rm -f /tmp/optimg_*.txt
    return
  fi

  # Confirm before proceeding
  echo ""
  if ! $AUTO_YES && ! $DRY_RUN; then
    echo -en "  ${BOLD}Proceed with optimization? [y/N]${NC} "
    read -r answer
    if [[ ! "$answer" =~ ^[Yy]$ ]]; then
      info "Aborted."
      rm -f /tmp/optimg_*.txt
      exit 0
    fi
  fi

  # Phase 2: Strip metadata
  phase_strip_metadata

  # Phase 3: Convert & compress
  phase_convert

  # Phase 4: Update references
  phase_update_refs

  # Phase 5: Summary
  phase_summary
}

main
Before: Size 320 KiB
BIN  static/images/arch-logo.webp  (new file)  After: Size 5.7 KiB

Before: Size 274 KiB
BIN  static/images/eagle-wizard.webp  (new file)  After: Size 69 KiB

Before: Size 1.1 MiB
BIN  static/images/elephant-and-blind-men.webp  (new file)  After: Size 56 KiB

Before: Size 240 KiB
BIN  static/images/fosscat_icon.webp  (new file)  After: Size 26 KiB

Before: Size 380 KiB  After: Size 186 KiB

Before: Size 104 KiB
BIN  static/images/hammock.webp  (new file)  After: Size 36 KiB

Before: Size 3.5 MiB
BIN  static/images/japan-arrival.webp  (new file)  After: Size 93 KiB

Before: Size 3.9 MiB
BIN  static/images/japan-fuji-town.webp  (new file)  After: Size 130 KiB

Before: Size 4.1 MiB
BIN  static/images/japan-fuji.webp  (new file)  After: Size 163 KiB

Before: Size 2.3 MiB
BIN  static/images/japan-mcdonalds.webp  (new file)  After: Size 217 KiB

Before: Size 9.2 MiB
BIN  static/images/japan-nekomachi.webp  (new file)  After: Size 767 KiB

Before: Size 6.2 MiB
BIN  static/images/japan-nippori-graveyard.webp  (new file)  After: Size 445 KiB

Before: Size 6.5 MiB
BIN  static/images/japan-nippori-walk.webp  (new file)  After: Size 450 KiB

Before: Size 5.0 MiB
BIN  static/images/japan-noodle-cup.webp  (new file)  After: Size 157 KiB

Before: Size 5.6 MiB

Before: Size 3.5 MiB
BIN  static/images/japan-shinjuku.webp  (new file)  After: Size 391 KiB

Before: Size 7.8 MiB
BIN  static/images/japan-shinjuzu-garden.webp  (new file)  After: Size 599 KiB

Before: Size 6.0 MiB

Before: Size 7.2 MiB
BIN  static/images/japan-spirit-tree.webp  (new file)  After: Size 533 KiB

Before: Size 870 KiB
BIN  static/images/monochrome-path.webp  (new file)  After: Size 526 KiB

Before: Size 121 KiB
BIN  static/images/nginx-mumble.webp  (new file)  After: Size 49 KiB
|
Before Width: | Height: | Size: 504 KiB |
BIN
static/images/ocean-aerial.webp
Normal file
|
After Width: | Height: | Size: 451 KiB |
|
Before Width: | Height: | Size: 1.1 MiB |
|
Before Width: | Height: | Size: 126 KiB After Width: | Height: | Size: 114 KiB |
|
Before Width: | Height: | Size: 3.7 MiB |
BIN
static/images/otto-on-nature-path-algorithm.webp
Normal file
|
After Width: | Height: | Size: 1.3 MiB |
|
Before Width: | Height: | Size: 678 KiB |
BIN
static/images/rpi-bookshelf.webp
Normal file
|
After Width: | Height: | Size: 113 KiB |
BIN
static/images/salt-lake-basin-view.webp
Normal file
|
After Width: | Height: | Size: 459 KiB |