With Hurricane Ian bearing down on my area, I decided to turn on the local news to see what the forecasters were saying. There was a good bit of coverage, and then a segment came on about PSAs. It was a screencap/screencast of a local county's messaging on Twitter. Twitter. I promptly looked away, realizing that the news is definitely a joke and that I need to stick to focusing on individual journalism.

by https://jacky.wtf

I'm definitely jaded. The last four years of reading have made me hungrier for sources. News outlets ought to be "primary" ones, but they tend to lean toward reinforcing things that aren't neutral (save for traffic and the like; even sports coverage is limited, because they don't talk about minor leagues outside the big three). Maybe TV was the problem all along, lol.


A great use case for decentralized social networking: during a tropical storm. Even if something local could only be used as a beacon, that would be helpful for finding people. Less ambitiously, maybe it could let people talk to one another while they shelter in place. I don't think WiFi travels all that far, though, so this is assuming a lot.

Well. I'm just now realizing that I don't have the money to pay for my Web server next month. I'm already a bit behind. So there's a good chance it'll be deleted. And my domains are due. Lol, it's more expensive to be online than I thought.

Hot take: all ActivityPub servers that make use of OAuth2 should expose that info in their actor response bodies. Coupled with OAuth server metadata, that would dramatically simplify the act of getting a token. Had Mastodon (and other popular services) done this, implementing clients could have been implementation-independent (you have to figure out authentication as part of the development process anyway).
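For what it's worth, the ActivityPub spec already reserves `oauthAuthorizationEndpoint` and `oauthTokenEndpoint` keys inside an actor's `endpoints` map. A server that actually populated them might return an actor object along these lines (the domain and URLs below are made up):

```json
{
  "@context": "https://www.w3.org/ns/activitystreams",
  "id": "https://example.social/users/jacky",
  "type": "Person",
  "inbox": "https://example.social/users/jacky/inbox",
  "outbox": "https://example.social/users/jacky/outbox",
  "endpoints": {
    "sharedInbox": "https://example.social/inbox",
    "oauthAuthorizationEndpoint": "https://example.social/oauth/authorize",
    "oauthTokenEndpoint": "https://example.social/oauth/token"
  }
}
```

A client could then discover where to obtain a token straight from the actor document, with no out-of-band configuration required.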

One of my pet projects is to eventually stop scrobbling to Last.fm and instead keep my listens either in some local database or push them to something I can self-host. Mainly to keep my listening history to myself, because I don't really use discovery algorithms (they don't seem to be that effective if you mix trending and non-trending music).

I mentioned a while ago how I mistook the leading games for an indicator of the kind of games people want — which is always incorrect. People don't even know that they're being monitored in the things they use every day, so how can any "metric" be an authentic reflection of their behavior? This just motivates me to work with people who care about people, and not feed into the excessively normalized state of violence that's somehow been deemed okay.

by https://jacky.wtf

I love the argument: "but humans have always been violent." That couldn't be further from the truth, and we really need to start getting people to either read or actually learn from non-McGraw Hill-approved textbooks. Those barely talk about the Civil Rights movement, yet will have spreads about the "glamour" of killing indigenous people; why would that be an accurate representation of the state of humanity?

This comes to mind with the recent uptick (from where I sit, YMMV) in violence by actual perpetrators being made into media for everyone to glamorize. The money could have been given to the families harmed; instead, it's made into something that clamors for an award. It's so sick.

by https://jacky.wtf


There's obviously a die-hard community around US military reenactment games. It's also wrong to say they're not used as a recruitment tool (via https://gamerant.com/call-duty-modern-warfare-recruitment-tool/). There are literally people from the US Department of Defense involved in the making of these games.

by https://jacky.wtf


I've been thinking a lot about games (and watching documentaries and videos), and it's likely there are more people who learned about WWII and the Cold War mainly from Call of Duty and other games than from actual history. There has to be a line drawn around reenactments of actual events, since these things are still being used as leverage in discussions (and these players grow up to be "voting members" of society, or to influence people who end up voting).

by https://jacky.wtf


Owning My Scrobbles

Some eight years ago, I stumbled upon an irreparable ’50s radio and turned it into a Pi-powered audio player.


That is, I gave it a fresh coat of paint and a new grill cloth, and built in two small-ish speakers and an equally small “Class D” amp. The thing’s driven by a Raspberry Pi equipped with the cheapest of USB “sound cards.”


The Pi runs MPD, and, alongside it, mpdscribble, a simple audio scrobbling service. Had it push song data to Last.fm—which I hadn’t used in years—for quite a while. And while that was fun, I ultimately decided to pull the plug. (I mean, “Who even cares?”)


Fast-forward to today, or rather, a month or so ago, when I saw all of the year-end Spotify banners fly by, and thought, “Hey, I could totally do this and retain ownership of my data.”


So, I looked at Libre.fm’s source code—Libre.fm being an open-source but somewhat outdated Last.fm alternative—and drew inspiration from it for a WordPress plugin of my own. (Of course, I later on discovered that such a plugin already existed.)


Luckily, mpdscribble’s config file—found at /etc/mpdscribble.conf—allows setting the Libre.fm endpoint. Pointed it to my site instead, and … it worked!
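For reference, the relevant section of that config file might look something like this sketch (the endpoint URL, credentials, and journal path are placeholders, not my actual values):

```ini
[libre.fm]
# Point the "Libre.fm" scrobbler at my own site instead.
url = https://example.org/scrobble
username = jan
password = secret
journal = /var/cache/mpdscribble/example.journal
```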


Next up was scrobbling from my Windows computer. I vaguely remember Last.fm’s Audioscrobbler plugins for, e.g., Winamp, but those would obviously have Last.fm’s endpoint hardcoded within (and have since been updated to the newer scrobbling protocol, while my endpoint only accepts the much older 1.2 version).


Anyhow, long story short: I mostly followed this Reddit post and used a hex editor to replace the default endpoint of a somewhat older foobar2000 plugin with my own! (The rest of the post isn’t super relevant.) Had to shorten my URL a bit to make it all fit, but: this, too, ended up working!


So … the result’s up at /listens. I’m still figuring out how to microformat these; I don’t think too many parsers support listen-of.
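If I do mark them up, an individual listen might look something like this h-entry sketch, using the experimental `u-listen-of` property (class names per the microformats2 drafts; the cited URL is a placeholder):

```html
<article class="h-entry">
  <a class="u-listen-of h-cite" href="https://example.org/track/123">
    Artist - Song Title
  </a>
  <time class="dt-published" datetime="2022-12-01T20:15:00+01:00">20:15</time>
</article>
```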

by Jan Boddez

My Reading Stack

[M]y “Boring Reading Stack” uses Instapaper and my Kindle as the primary services to read content. […] My RSS reader of choice is Feedbin. […] The RSS feed of all my starred items is [piped] through IFTTT to Instapaper.


—Stefan Zweifel, Boring Reading Stack


Here’s mine:



  • I still read off either a smartphone screen or computer monitor

  • I’m using an RSS reader I built myself—well, it’s kind of a fork of Aperture meets Monocle, but quite different—and it doubles as a read-it-later app

  • I don’t do newsletters, ’cept those that offer RSS, of course


I used to also use wallabag, a read-it-later app that bypasses cookie walls and such, and that I’d set up to sign into paid-subscription sites on my behalf, but I’ve found I no longer need it.


I’ve set up my RSS reader to fetch those paid articles instead, and I can send it random articles, too, using Micropub and a bookmarklet (or Indigenous on mobile). The way it works is I use those to create (private) “read” posts in my site’s CMS, which then adds, again through Micropub, the actual articles to my reader.


This is less of a hassle than it may sound: Indigenous, for instance, is already tied to my CMS (through IndieAuth—think “Sign in with Twitter,” but with your very own site instead).


Anyway, I now only have to host a single app, and one that runs on just about any shared hosting platform at that!

by Jan Boddez

Micropub, Crossposting to Twitter, and Enabling “Tweetstorms”

I’ve previously talked about how I crosspost from this blog to my Mastodon account without the need for a third-party service, and how I leverage WordPress’s hook system to even enable toot threading.


In this post, I’m going to really quickly explain my (extremely similar) Twitter setup. (Note: I don’t actually syndicate this blog’s posts to Twitter, but I do use this very setup on another site of mine.)


I liked the idea of a dead-simple Twitter plugin, so I forked my Mastodon plugin and tweaked a few things here and there. Once I’d installed it, created a developer account, generated the necessary keys, and let WordPress know about them, things looked, well, very familiar. In fact, crossposting should now just work.


Now, to enable this when posting through Micropub rather than WordPress’s admin interface! Posting through Micropub means no WordPress interface, and thus no “meta box,” no checkbox, and no way for WordPress to know whether I wanted to crosspost a certain article. So I’m going to have to use … syndication targets (which were invented for precisely this reason).


First, I’m updating my Micropub config:


add_filter( 'micropub_syndicate-to', function( $syndicate_to, $user_id ) {
	return array(
		array(
			'uid'  => 'https://twitter.com/ochtendgrijs',
			'name' => 'Twitter',
		),
		array(
			'uid'  => 'https://geekcompass.com/@jan',
			'name' => 'Mastodon',
		),
	);
}, 10, 2 );

That should inform my Micropub client of the various endpoints my CMS supports.
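Concretely, when a Micropub client issues a `GET` request with `?q=syndicate-to` against the endpoint, the (WordPress) Micropub server should now respond with something like:

```json
{
  "syndicate-to": [
    {
      "uid": "https://twitter.com/ochtendgrijs",
      "name": "Twitter"
    },
    {
      "uid": "https://geekcompass.com/@jan",
      "name": "Mastodon"
    }
  ]
}
```

The client typically renders these as checkboxes, and sends the selected `uid`s back along with the new post.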


Then, the callback. As always, there are quite a few ways to get what we want. What this snippet does is set the custom field that tells Share on Twitter whether it should crosspost, and then simply re-trigger the hook used by that same plugin.


add_action( 'micropub_syndication', function( $post_id, $synd_requested ) {
	$post      = get_post( $post_id );
	$syndicate = false;

	if ( in_array( 'https://twitter.com/ochtendgrijs', $synd_requested, true ) ) {
		// Update sharing setting.
		update_post_meta( $post_id, '_share_on_twitter', '1' );

		$syndicate = true;
	}

	if ( in_array( 'https://geekcompass.com/@jan', $synd_requested, true ) ) {
		// Update sharing setting.
		update_post_meta( $post_id, '_share_on_mastodon', '1' );

		$syndicate = true;
	}

	if ( $syndicate && 'publish' === $post->post_status ) {
		// Re-run the `transition_post_status` hook, in order to trigger
		// syndication.
		wp_transition_post_status( 'publish', 'publish', $post );
	}
}, 10, 2 );

Note: As the callback above is run after the transition_post_status hook, which is where the actual crossposting would’ve happened, we have to trigger it once more (for published posts only).


And that’s literally all it takes to get syndication working in combination with Micropub! (Again, if you’ve seen my “Mastodon” implementation, all of this should be really familiar.)


Optional: Enabling Threaded Tweets


And, finally, the “Tweetstorm” bit, which is fairly ugly, mainly because I’m using get_page_by_path() instead of the—in my case, somehow—less robust url_to_postid(). Also, part of this code references the specific custom post type I’m using for short-form content, and would not simply work elsewhere.


Oh, and lastly, this would require you to filter the actual statuses, too, so that they contain the actual notes and not just a title and permalink.


add_filter( 'share_on_twitter_tweet_args', function( $args ) {
	$status = $args['status'];
	$post   = null;

	// `notes`, without `front`, is what's defined as my note archive's permalink slug.
	$pattern = '#<a(?:.+?)href="' . home_url( '/notes/' ) . '(.+?)"(?:.*?)>(?:.+?)</a>#';

	if ( preg_match( $pattern, $status, $matches ) ) {
		// Status contains a link to an earlier note. Try to fetch it.
		$post = get_page_by_path( rtrim( $matches[1], '/' ), OBJECT, array( 'iwcpt_note' ) ); // `iwcpt_note` is my "note" CPT.
	} else {
		// Same thing, but for articles. The exact argument of `home_url()`
		// would depend on your permalink `front`.
		$pattern = '#<a(?:.+?)href="' . home_url( '/articles/' ) . '(.+?)"(?:.*?)>(?:.+?)</a>#';

		if ( preg_match( $pattern, $status, $matches ) ) {
			// Status contains a link to an earlier article. Try to fetch it.
			$post = get_page_by_path( rtrim( $matches[1], '/' ), OBJECT, array( 'post' ) );
		}
	}

	if ( $post ) {
		$tweet_id = basename( get_post_meta( $post->ID, '_share_on_twitter_url', true ) );

		if ( ! empty( $tweet_id ) ) {
			$args['in_reply_to_status_id'] = $tweet_id;

			// Twitter demands replies mention the account being replied to.
			$username = get_option( 'share_on_twitter_settings', array( 'twitter_username' => '' ) )['twitter_username'];
			$status   = '@' . $username . ' ' . $status;
		}
	} elseif ( isset( $matches[1] ) ) {
		error_log( "Could not convert URL to post ID for the page with slug {$matches[1]}." );
	}

	$args['status'] = trim( wp_strip_all_tags( $status ) );

	return $args;
} );

Note: Still using get_page_by_path() rather than url_to_postid(), as that’s what works for me and my hacky WordPress install.
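As for the status filter mentioned above, a minimal sketch could hook that same `share_on_twitter_tweet_args` filter at a slightly earlier priority. (This is an assumption-laden sketch: I'm hypothesizing that the filter also receives the post object as a second argument, which may not match the plugin's actual API; `iwcpt_note` mirrors the CPT name used in the snippet above.)

```php
// Sketch only: assumes `share_on_twitter_tweet_args` also passes the post
// object as a second argument.
add_filter( 'share_on_twitter_tweet_args', function( $args, $post ) {
	if ( 'iwcpt_note' === $post->post_type ) {
		// For short-form notes, use the note's content rather than the
		// default title-plus-permalink status. HTML is deliberately left in
		// place here; the threading filter above runs at the default
		// priority (10), detects reply permalinks inside the status, and
		// strips tags at the very end.
		$args['status'] = $post->post_content;
	}

	return $args;
}, 9, 2 );
```

Running at priority 9 matters: the threading filter needs to see the note's HTML (including any in-reply-to link) before everything gets flattened to plain text.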

by Jan Boddez

Owning My Watch Later List

I don’t actually use YouTube a whole lot, but I’ve absolutely stolen Marty McGuire’s idea for “owning your ‘Watch Later’ list.” Here’s the—somewhat simpler, I think—approach I came up with:



  1. Follow YouTube channels via RSS

  2. Use Micropub to post interesting video URLs to my site

  3. Publish a list of the most recent URLs

  4. Have youtube-dl check that list at regular intervals, and process new videos only


That’s it.


I could’ve stuck with a purely text-based list, a GitHub gist perhaps, but since the site already supports different post types and Micropub … I chose the path of least resistance.


Some Technicalities


On my (Android) phone, I use Indigenous to post Reads to my Micropub endpoint. I’m not sure it supports the probably more appropriate (but, whatevs) Watch post type. It doesn’t actually matter, as I’ve instructed my WordPress install to interpret incoming “Reads” with a YouTube URL as videos that belong on the “Watch Later” list.


A bit of custom PHP then outputs a plain-text list of the ten latest such videos. (I may eventually also publish a proper, IndieWeb-style public archive and feed.)
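That “bit of custom PHP” could be as small as the following sketch. (The post type and meta key are hypothetical stand-ins; the real install’s names will differ.)

```php
// Print the ten most recent "Watch Later" URLs as plain text, newest first.
// `iwcpt_read` and `_watch_later_url` are hypothetical names.
header( 'Content-Type: text/plain; charset=utf-8' );

$query = new WP_Query( array(
	'post_type'      => 'iwcpt_read',
	'posts_per_page' => 10,
	'meta_key'       => '_watch_later_url', // Only posts that have a video URL.
) );

foreach ( $query->posts as $p ) {
	echo get_post_meta( $p->ID, '_watch_later_url', true ) . "\n";
}
```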


On my media server, I use a cron job to first download that list, and then feed it to youtube-dl by means of the -a argument. I’m also using --download-archive to keep a list of already downloaded and processed videos.
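Put together, that cron job might look something like this sketch; the list URL, paths, and schedule are placeholders, not my actual setup:

```shell
#!/bin/sh
# watch-later.sh (sketch): fetch the published URL list, then have
# youtube-dl download only the videos it hasn't processed before.
curl -fsSL https://example.org/watch-later.txt -o /tmp/watch-later.txt

youtube-dl \
    --batch-file /tmp/watch-later.txt \
    --download-archive "$HOME/.watch-later-archive" \
    -o "$HOME/videos/%(title)s.%(ext)s"

# Then, e.g., in crontab: 0 * * * * /home/media/bin/watch-later.sh
```

`--batch-file` is the long form of the `-a` argument mentioned above, and `--download-archive` is what keeps already-processed videos from being fetched twice.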

by Jan Boddez

It would be best to enable the filter for the Godot base app; it's been getting a lot of work done on it. However, would the filter still be needed once the new means of deploying apps to AppCenter is out?