One useful thing about working across multiple projects at once is all the potential for cross-pollination.

The p2p search indexing relates to the community web archival, relates to the mesh network content loading optimization, relates to local-first web apps, and relates to cooperative governance models.

It's like planting seeds in a bunch of places and slowly weaving the trees together into a larger structure.

Teaching another remote dev how some of our 3D scene code works from inside the app, with live audio and the code up on a screenshare via WebRTC, is wild. Like I can show him the code and then right in front of it I can show him how that code affects the world.

@webxr code

Super pumped to be chatting with the person behind unit.land/ to talk about how we can use it in @agregore to make it easy to visually assemble applications with zero extra code.

Saving and loading entire Unit graphs is super easy with IPFS + fetch, and you can even have a zero-server collaborative live coding environment just by plugging libp2p pubsub into Unit's graph editor.
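Rough sketch of what I mean, purely as an assumption (the ipfs:// upload method, the Location header, and the graph shape here are guesses, not something I've verified):

```
// Sketch: save/load a Unit graph over Agregore's ipfs:// fetch handler.
// Assumes uploads work via PUT and hand back the content-addressed URL
// in a Location header; the graph shape is just a placeholder.
const graph = { units: {}, merges: {} } // whatever the Unit editor exports

// Save: upload the serialized graph to IPFS
const saved = await fetch('ipfs://localhost/graph.json', {
  method: 'put',
  body: JSON.stringify(graph)
})
const graphURL = saved.headers.get('Location')

// Load: fetch it back by its ipfs:// URL and hand it to the editor
const loadedGraph = await (await fetch(graphURL)).json()

// For live collab, graph edits would get broadcast over libp2p pubsub
// and applied to every peer's editor; that wiring is omitted here.
```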

u have to be very quiet when hacking so the computer doesn't hear you. that's why it's called ssh

Taking a small step back, I really think I'd love to work on full time. I like the other projects I'm working on right now, but they just don't spark joy in the same way.

For any users interested in / tech, the latest prerelease version of @agregore has some fixes that make it more stable on M1/M2 Macs, along with general upgrades.

github.com/AgregoreWeb/agregor

Where the heck do streaming sites (of the not corpo variety) even source their content? Like, it feels like they're getting data more reliably than torrents these days.

One thing about me is that extrinsic motivation doesn't work very well for me. Rewards and punishments are just tiny blips. Generally I can only really do stuff that I genuinely believe in and only really do things because I think they're right rather than out of obligation.

This can make it really hard to force myself to do stuff I'd rather not for the sake of surviving under crapitalism. :P

Just annual reminder that you’re not too old to take up a sport or a hobby or learn a language or rollerskate or learn martial arts or join a debate club or learn an instrument and play in an orchestra. Adults can do that too. It’s totally allowed.

Just absolutely floored that the state-of-the-art in keeping language models on the rails is "give it a really firm talking-to about staying on script" and that this approach has been publicly and embarrassingly proven not to work multiple times, and that they still just keep trying with more elaborate and forceful pep talks

The SQLite WASM support seems neat. I heard somewhere that there's a way to perform a query on multiple DB backends at once, so it'd be cool to see if that's possible here.

sqlite.org/wasm/doc/trunk/api-

With that in place you could query data from multiple peers together without needing to merge their datasets.

This is kinda the approach I took for HyperBeeDeeBee in applications where multi-author queries were important.
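Back to the SQLite question: if the WASM build behaves like plain SQLite, the multi-database query would presumably be ATTACH, i.e. open one connection, attach each peer's database, and join across them. A minimal sketch (the oo1 API calls are how I remember the docs, and the table/column names and file paths are made up):

```
// Sketch: query two databases at once with ATTACH in the sqlite3 WASM build.
// Assumes both files already exist in the WASM VFS (e.g. fetched from peers).
const sqlite3 = await sqlite3InitModule()
const db = new sqlite3.oo1.DB('/peer-a.sqlite3')

// Attach the second peer's database under its own schema name
db.exec("ATTACH DATABASE '/peer-b.sqlite3' AS peer_b")

// Join across both datasets without ever merging them
const rows = []
db.exec({
  sql: `SELECT a.title, b.title AS other_title
          FROM posts AS a
          JOIN peer_b.posts AS b ON a.topic = b.topic`,
  rowMode: 'object',
  resultRows: rows
})
console.log(rows)
```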

I feel like I should be using SQLite in p2p use cases way more.

Only thing that's unclear, really, is how write throughput would work. It seems like doing periodic dumps of datasets is the best use case there, but that doesn't play as nice with application UX where people expect stuff to sync on the fly.

This "Wikipedia as a static DB" use case is extremely cool for example.

static.wiki/en/Peer-to-peer
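The usual trick behind these, as far as I understand it, is a SQLite VFS that turns page reads into HTTP Range requests, so a query only downloads the handful of pages it actually touches. Conceptually something like this (just the range-fetch part, with a made-up helper name):

```
// Conceptual sketch of "SQLite over HTTP range requests": instead of
// downloading the whole .db file, fetch only the bytes the VFS asks for.
async function readChunk (url, offset, length) {
  const end = offset + length - 1
  const res = await fetch(url, {
    headers: { Range: `bytes=${offset}-${end}` }
  })
  return new Uint8Array(await res.arrayBuffer())
}

// e.g. grab just the 100-byte SQLite header of a remote database
// const header = await readChunk('https://example.com/wiki.db', 0, 100)
```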

Neat, somebody put together a Tor Browser POC in @electronjs a couple of years ago.

I've been thinking of adding the concept of "containers" to @agregore and I think it'd be cool to have a "tor container" which disables and as much fingerprinting stuff as it can to help anonymize users when browsing via Tor.

It'd be cool to do something similar for

Sadly it'll have to be a TODO since there's no one funding work or devoting spare time to that. :P

Decentralized tech is hard to make because not only does it need to work great, but the UX needs to be so much better than the state of the art (download an app from the app store that talks to a server, open a web page that loads _anything_ from a server) that someone would bother using it over their existing workflows. :P

Then again, UX wise, random HTTP servers that stream loads of data for free (with a bunch of annoying ads) beat out the UX of any BitTorrent thing. 😂 Just enter a link, search the thing, and bam, it's right in your face. No need to fuss with clients (but you lose control over how the video is presented)

Oh shit, I entered that headspace again that I call "BitTorrentMode" where I can't shake the feeling that the current state of is honestly behind what BitTorrent was in usability like a decade ago.

I love all the new protocols for their advantages, but the UX just isn't anywhere near "install some random client and paste a link".

They got it right and I wish it kept going instead of losing relevancy.

BEP46 would have solved the UX of needing to search for updated torrents as something is released.
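For anyone who hasn't read it: BEP46 publishes a torrent under a long-lived ed25519 public key via the DHT's mutable items, so one magnet link keeps resolving to whatever infohash the publisher signed most recently. Roughly:

```
// BEP46-style magnet link: the "exact source" is a public key, not an infohash
const magnet = 'magnet:?xs=urn:btpk:<public-key-hex>'
```

So "go hunt for the new torrent" turns into "refresh the same link".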

> Body is a `FormData`
```
f = new FormData()
f.append('file', new Blob(["<h1>Hello There! :)</h1>"]), 'index.html')
```
> Add it to the request
> OH FUCK I broke the torrent thing and didn't have tests (guess I'll have to try again another day :P)

Wanted to send a friend a little message.

> Open up @agregore
> Open the Docs for bt-fetch github.com/RangerMauve/bt-fetc
> Open a new window (Ctrl+N)
> Forgot how the hell to make a torrent
> Open the unit tests to see an example: github.com/RangerMauve/bt-fetc
> Open Devtools (Ctrl+Shift+I)
> It's a POST request

```
r = await fetch('bittorrent://localhost', {
  method: 'post',
})
```

(cont)
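Putting this post and its continuation further up together, the whole request would presumably look something like this (untested, given the breakage mentioned above, and the response handling is a guess):

```
// Sketch: create a torrent from an in-memory file via bt-fetch in Agregore,
// combining the POST request and the FormData body from the posts above.
const f = new FormData()
f.append('file', new Blob(['<h1>Hello There! :)</h1>']), 'index.html')

const r = await fetch('bittorrent://localhost', {
  method: 'post',
  body: f
})
console.log(r.status, await r.text())
```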
