Jupiter

social.alexschroeder.ch

Profile for Alex Schroeder. Username: @alex@social.alexschroeder.ch. Role: admin.

About

Biology, Plants, Birds, Photography, Switzerland, Emacs, Wiki, Gemini, Programming, Tea, Drawing, Music. Moving over from kensanata@octodon.social, slowly.
Languages: gsw de en fr pt.
He/him.

Joined in June 2022. 1929 posts. Followed by 336. Following 292.

Recent posts

Alex Schroeder @alex

I use something like the following for my web server to try to deny all bots, spiders, and crawlers access to my site. This is the second level of defense; defense in depth is good. Level 1 is robots.txt; Level 2 is user agent filtering; Level 3 is fail2ban monitoring the access log and banning anybody who requests stuff faster than I think people can read.
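
Level 1 is simple; a blanket-deny robots.txt looks something like this:

# ask all well-behaved crawlers to stay away
User-agent: *
Disallow: /

Well-behaved crawlers honor it; the rude ones are what the next two levels are for.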

Anyway: the user agent filter. These three conditions all need to be true. The first condition makes sure only the sites listed are affected (because some sites are exempt…). The second condition makes an exception for Archive Bot and Gwene. The third condition matches all self-identified bots, crawlers and spiders. The actual rule tells them that the page is gone and should be deleted (status 410), and it also adds a Location header leading to https://alexschroeder.ch/nobots, just in case a human is curious.

The reason is this: for a while, it seemed that we all benefited from search engines – authors and readers both. These days, you'll find that search results are full of garbage sites. Big sites with the most flatulent of pages explaining in great detail why the thing you're looking for is important and how to do it, clearly optimized for an ad company and not for a reader. Big sites with a gazillion answers are preferred over small, individual sites. Perhaps that's easier. Perhaps it allows them to diffuse responsibility for the garbage, I don't know. The effect, in any case, is that search engines no longer benefit small site authors, either. I was unable to find my own pages on the search engines. If you are a small site owner and you think you can find your own pages on Google and Bing, I suspect that's because they track you. Try it on a different computer, anonymously. Perhaps you won't find yourself, either.

In any case, if I can't get anything in return, both as a reader and as an author, I feel that the deal is off. Why let them feed on my words for free? Nay, at a cost, since they are keeping my website busy, producing CO₂ and heating the planet for no benefit at all.

Better to block them all.

RewriteCond "%{HTTP_HOST}" "^(alexschroeder\.ch|…)$" [nocase]
RewriteCond "%{HTTP_USER_AGENT}" "!archivebot|^gwene" [nocase]
RewriteCond "%{HTTP_USER_AGENT}" "bot|crawler|spider" [nocase]
RewriteRule ^ https://alexschroeder.ch/nobots [redirect=410,last]
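
As for Level 3, a minimal fail2ban sketch might look like the following. The jail name, thresholds, and log path are illustrative, not the real setup: the filter matches every request line in the access log, and the jail bans any address making more than 30 requests within 20 seconds.

# /etc/fail2ban/filter.d/too-fast.conf (hypothetical filter name)
[Definition]
# <HOST> captures the client IP at the start of an Apache access log line
failregex = ^<HOST> .* "(GET|POST|HEAD)

# /etc/fail2ban/jail.local (illustrative thresholds)
[too-fast]
enabled  = true
port     = http,https
logpath  = /var/log/apache2/access.log
findtime = 20
maxretry = 30
bantime  = 3600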
Alex Schroeder @alex

The feeling when Reddit doesn't show more than a few lines on Firefox because I'm blocking stuff, then Reddit doesn't show anything when I use the Emacs web browser, then I read the page at last using w3m in a terminal emulator and I'm already angry again.

Alex Schroeder @alex

I clicked a link to a shop and it had a big popup saying “you have been selected…” and my fingers immediately closed the site without allowing my brain to finish processing. I guess it was something about a redesign. This is my reality on the web. Always be closing!

Alex Schroeder @alex

I hate it when I am writing a blog post, switch away from the browser for a second, and then it updates. And the text I was writing is gone. I don’t like this. Not one bit. Do I need to use a Markdown editor because I can no longer trust my browser to not restart in mid-editing? 😩

Alex Schroeder @alex

This is me thinking about #oddmu usability; Oddmu is an opinionated (?) wiki written in Go, see https://alexschroeder.ch/view/Odd%c2%b5

I recently uploaded 20 images and posted some text to my website, and the experience wasn’t bad at all. Perhaps I will add some tweaks, such as redirecting back to the same upload page, incrementing the last number of the file name, and showing the last upload. Something like that would have helped.
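
As a sketch of the incrementing, in Go since that’s what Oddmu is written in – nextName is a made-up helper, not actual Oddmu code:

package main

import (
	"fmt"
	"regexp"
	"strconv"
)

// lastNumber finds the final run of digits plus any non-digit
// tail, such as a file extension.
var lastNumber = regexp.MustCompile(`(\d+)(\D*)$`)

// nextName increments the last number in a file name, keeping
// the zero padding: "photo-07.jpg" becomes "photo-08.jpg".
func nextName(name string) string {
	m := lastNumber.FindStringSubmatchIndex(name)
	if m == nil {
		return name // no number to increment
	}
	digits := name[m[2]:m[3]]
	n, _ := strconv.Atoi(digits)
	next := fmt.Sprintf("%0*d", len(digits), n+1)
	return name[:m[2]] + next + name[m[4]:m[5]]
}

func main() {
	fmt.Println(nextName("photo-07.jpg")) // photo-08.jpg
}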

Another feature I need is deletion of uploaded files. Right now, only pages can be deleted. The inability to fix typos in file names when uploading is a big source of unhappiness for me.

Alex Schroeder @alex

When I migrated a few thousand blog pages from Oddmuse to Oddmu (two wikis I maintain), I had to move from a weird, personal, eclectic markup to Markdown as parsed and rendered by the particular parser I’m using. As it turned out, the solution was to take the old HTML and convert that to Markdown, incrementally. That worked well enough for me.

https://alexschroeder.ch/view/2023-09-12-oddmuse-to-oddmu
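
The page linked above has the details. One generic way to do the HTML to Markdown step is pandoc (the file names here are placeholders, and the actual migration may have used something else):

pandoc --from html --to markdown page.html --output page.md

A shell loop converts a whole directory of old HTML files:

for f in *.html; do pandoc --from html --to markdown "$f" --output "${f%.html}.md"; done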

Alex Schroeder @alex

Reading Lower Standards by Richard P. Gabriel, about writing a poem a day. Or doing anything, often, instead of revising endlessly.
«On March 18, 2000 I began writing a poem a day. Today is June 19, 2001, and I have 428 poems written—I skip a day or two every month. What is it like to do this? How much does William Stafford’s advice to “lower standards” help? How does one revise in this regimen? What techniques can one use to gather material to “write about”? What does it cost to work this way? What can you gain once you’ve made the work less precious or special? Is any of it publishable? Quality “versus” quantity? This little class will start with some observations I’ve made and continue with a discussion. Resistances will surely come up. And fear.»
https://dreamsongs.com/Files/PoemADay.pdf
Via @salamandar
