On Thu, Dec 3, 2020, 7:04 PM professor rat <pro2rat@yahoo.com.au> wrote:
Coderman was just being kind and gentle with Karl, who is feeling fragile
and painful.  He is _not_ well...."

Yes...I said that was touching. Twice. 

Note also Cow-man was sending 'best regards' moo's to some known Nazi on this list.

The technical term for that is ' collaboration ' .

And Quisling traitors get stabbed, shot and hanged everyday. 

He's been warned. As has Karl.  And you too, while your here. 

Warning not received.  Warnings require clear communication, in the language of the listener, and confirmation that they have been heard.

I understand that trolls will fill communities I consider helpful.  I do not understand that I can prevent anything with complicit behavior.

Sorry for replying.  Let's see if I can add something helpful towards replying more how I choose ...

TLDR: (too long, don't read)
I tried out storing a month of the cpunks list on a blockchain, which would cost about a dollar to store in plaintext if done more efficiently than I did.  This system cost more like $5 a month.  Here are the links:

2020-September
author https://bico.media/fedd0a8368dd68ae495279094435391f0e13291866af7a8a26aa182028af2df6
date https://bico.media/bd7fb31a5d7e685fcba3892fd28a7e4f7cc35c57576e7a7812a68746e48c15f1
subject https://bico.media/4fe2cc266634e04401d27e366529b83c1f61cecf7767ab53f4b426dcc0590970
thread https://bico.media/a41c50edfa8fc0c46d0f46ae82ac8c65e9f925f5c5a731006cb421318cd524e6

https://github.com/xloem/cpunks.org-bsv-partial.git

Below is how I did it.  I wrote everything I did into this ridiculously huge email, and somehow got somewhere.

I'll spend some time trying to archive this list on a blockchain.  Right now it is 4:06 PM my time.  I'm moving to a networked laptop.

It's 4:07 .  I reached the laptop.  Instead of using it, I turned on the light.  Now I feel suddenly thirsty.  I'm getting water.

I got water.  I drank some.  It's 409.  I'm still thirsty.  I'm sitting at the laptop.

I'm trying to figure out how to upload things.  I will write cognitive notes here to figure it out.  Bsvup may have an issue.  The issue is likely small.  I have an existing partial list archive.  I'll look for it briefly.  It's 410.

It's 411.  I found my partial archive on my raid array.  It's mounted in /media/3/archive/ .  I'll see if there's anything in there for this already.

I don't see any scripts or anything in there.  I started resuming the archival by typing wget --mirror https://lists.cpunks.org .  It's 413 .

I'll try to set up bsvup to test whether it can upload anything right now.  The bsv API ecosystem can change a lot.

I saw the mood file is over three hundred megabytes and got discouraged.  That's so expensive!  But even a day's worth of archived email would be great.  I might be able to pass flags to bsvup to upload only a little bit.  I may have to hack something a little.  It's 415 .

I typed `bsvup init` and it worked!  I have bsvup installed on this system already.  Great.  I'll use an exchange to send some money to it since I don't remember where any of my folders or accounts are.  It's 416.

I forgot the name of the exchange I have bsv in.  It's 417.  Did it start with 'p'?  No, poloniex I think kicked me because of some USA law.  What was it?  Hey, I think I have a client app on my phone.  I'll look around.  It's 418.

Bittrex!  I think this app will even let me send coins.  I'll do that.  It's 419.

Dang.  I'm not signed in.  If I type the wrong password 3 times my account will get locked out and I'll have to go through a multi-day process to re-access it.  I remember my password is more complex than usual, to meet requirements they have.  I have one extra complex password I'm practicing memorizing to use for stuff like this.  I'll try that.  It's 421.

I'm not sure if I have the memorized password right, so I'm trying it somewhere else first to verify.  I got it on the second try and wrote it down to remember after I look away.  It's 422.

Oh my god it worked!

They need me to do two factor authentication as well as email confirmation to log in.  So, this email gets drafted.  It's 424.

I'm logged in after trying the email confirmation a couple times!  I have a thousand dollars in here somehow.  Let's transfer some BSV.  It's 426.

I sent 0.01 - 0.001 fees = 0.009 bsv!  It takes a bit to process the transfer, but once it's on the network it's sometimes possible to use it even before confirmation.  It's 429.

I'm not sure what to do now.  I could look for other accounts to see if I can get the money faster.  Maybe I'll set up the archive as a git repository while I wait.

The entire 2020-September folder is 3.9M .  I should probably figure out how much a single month costs.  It looks like these things compress down to under a megabyte.  I'm also aware that bsv's bcat:// storage protocol can handle gzipped data in a way that bsvup does not use.  It's 4:33 PM.
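If anyone wants to check the compression estimate, here's a quick python sketch (the folder path is the one I use later; this is just a rough measure, not what bcat would actually store):

import gzip, pathlib
folder = pathlib.Path("lists.cpunks.org/pipermail/cypherpunks/2020-September")
raw = b"".join(p.read_bytes() for p in sorted(folder.rglob("*")) if p.is_file())
print(len(raw), len(gzip.compress(raw)))   # raw bytes vs gzipped bytes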

Goal: calculate prices.  Assume: 500 sat/KB.
Uhhh ...

I'm used to using python when my brain doesn't do units conversion for some reason.  I open python3.  It's 434.

Comes to about two million sat for September, the whole folder.  I think there are a hundred million sat in a coin.  I'll check that.  Great!  Okay, so divide by that.  It's 436.

Comes to about 0.02 BSV.  So, while I wait for my first transfer, I'll start another that's that big.  It's 4:37.
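For anyone following along, the arithmetic as a python3 snippet (the 3.9 MB folder size and the 500 sat/KB rate are the assumptions above):

# rough cost of uploading the 2020-September folder at the assumed rate
folder_kb = 3.9 * 1024          # ~3.9M folder, in KB
rate_sat_per_kb = 500           # assumed fee rate
sat_per_bsv = 100_000_000       # a hundred million sat in a coin
total_sat = folder_kb * rate_sat_per_kb
print(total_sat)                # ~2 million sat
print(total_sat / sat_per_bsv)  # ~0.02 BSV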

I sent some more.  My first transaction went through.  The id is 6542f43294c8e7052d175a3fd3bb9bd130330517a349f563554acfe0c9df6264 .  I'll find this on a block explorer to see when it gets confirmed.

It's 4:41 .  Here's the transaction on blockchair: https://blockchair.com/bitcoin-sv/transaction/6542f43294c8e7052d175a3fd3bb9bd130330517a349f563554acfe0c9df6264
My phone keeps glitching and truncating the url, but it's not hard to find.  It's 4:42.

I'm looking at bsvup --help to see how to upload only a subfolder but keep the whole hierarchy.  I'll need to provide two options.  It's 443.

I draft the command in a blank script to kill two birds with one stone.

It's 444.  I've typed `bsvup --file localdirectory --subdirectory remotedirectory`.

I saw the quoted text below this message and got scared.  Not sure why people would be in different groups.  I don't support labeling people in judgemental ways.

It's 447.

I'll need to find a subfolder to watch bsvup crash on and fix it.  lists.cpunks.org/pipermail/cypherpunks/2020-September .  I'll just hardcode that into my script for now.  Hey, the local folder is the same as the remote folder!  That's great!  It's 448 .

Done.  449.  Let's test this out.

Ran it but nothing happened.  Reviewing content.

I need to pass a command to bsvup in addition to flags.

dir=lists.cpunks.org/pipermail/cypherpunks/2020-September
bsvup --file "$dir" --subdirectory "$dir" upload

It's 451.

I ran it!  It's checking whether the files "exist" on the chain.  I think this code might be partly broken.  It's harmless since none of them are likely to exist, as far as I am aware.

It's checking a lot of files and could run into API rate limits, which could break the upload.  No errors reported so far.  It's 452 .

While I wait,  I'll add the script to the git repo and upload it somewhere.  My github account is the place that is the most psychologically accessible to me.  It's 453.

It's 454 and I made https://github.com/xloem/cpunks.org-bsv-partial .

My bsvup run has finished reviewing the data.  Now I'm scrolling up to check for any glaring problems, before confirming the upload.

The total price was set to 3620399 sat at a fee of 1 sat/byte.  The BSV network will accept .25 sat/byte .  I'm going to restart it with 0.5 sat/byte, which is what its default API server required, last I remember (maybe a year or two ago).  It's 457.

bsvup --file "$dir" --subdirectory "$dir" --rate 500 upload

While that's running again, I'll split the script into two.  One to update the local archive, and another to do the trial upload.  It's 459.

It's 500 PM and I've uploaded the two scripts to the github repository.  The test upload is still re-checking for existence.

Seems like waiting and focusing on my task is what to do now.

The existence checks finished.  Reviewing.

Total price: .01946 bsv somehow.
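A rough sanity check on that number, assuming the 3620399 figure above means roughly that many bytes of transaction data (it was quoted at 1 sat/byte):

tx_bytes = 3620399              # implied by the earlier 1 sat/byte quote
sats = tx_bytes * 0.5           # restarted at 0.5 sat/byte
print(sats / 100_000_000)       # ~0.0181 BSV, same ballpark as the 0.01946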

I just remembered I need to include the bsv address in the git repo.  Maybe I'll remember when I finish this upload review.

I started the upload.  It's about a thousand transactions total.  It's 503.  While it's running, I'll add the bsv stuff to the git repo, including the encrypted private key.

Y'know, maybe I'll keep it encrypted, and set the password to something guessable.

I'll have to set up a test folder to see if I can change the passphrase from an empty string without destroying it.  It's 504.  The upload has exhausted the server mempool and is still in progress, waiting for the next block.

I'm setting up a notes pane to write down what I figure out about the encryption key, since I don't remember things when I look away from them.  I often don't even remember that I have that issue, but I'm familiar with trying to use a computer while having it.  It's 506.

goal: change a bsvup encryption key

note: --key lets one set a key, likely WIF
note: I do not see a way to export a key

I'm thinking the most effective thing here would be to use the nodejs terminal or write a quick nodejs script that loads bsvup as a library and calls its internals to change the passphrase.  It's 509.

Step 1: find the functions for saving and loading the key, in the current bsvup install.  `type -p bsvup`.

readlink --canonicalize /bin/bsvup

It uses its Cache module, calling loadKey(password) to load the key.  It's 511.

There is also Cache.saveKey(key, password).  Sounds easy.  It's 512.

At the top of the file, I see Cache is the result of `require('./cache.js')`.

So the module I need is /usr/lib/node_modules/bsvup/cache.js .

Thirsty again.  513 PM.

I started mutating the key, but I should note the test address first so I can verify mutation didn't alter it.

K.  It's in my notes pane.  515 PM.

Let's mutate this encrypted key, as a test.

I mutated it without harm with:

c = require('/usr/lib/node_modules/bsvup/cache.js')
c.saveKey(c.loadKey(oldpass), newpass)

I'll back up my real key before mutating it.  It's 517.

I think I mutated it successfully.  I'll check that block explorer url to see if the address is right.  It's 519.

Yes, it looks correct.  I'll add it to the github repository.  I just realized that I'm not aware of any inproxies that will display the D:// url this upload will make, although I have been out of the loop for a while, so I'll also want to make an index page to see the content.  Anyway, d:// urls are mutable and can be misleading.  It's 521.

I added the keyfile to the git repo, so the script would likely at least run if somebody else tried it and had bsvup installed, which may be on npm, though likely a different version.  It's 524 pm.

946 transactions are left to broadcast.  I'll work on making that index page.

Reminder to self:  this subfolder is lists.cpunks.org/pipermail/cypherpunks/2020-September

It looks like the html files use relative urls often, which is great.  Many of the urls are absolute, too, though.

Since I'm mirroring the folder structure I could mutate arbitrary links to direct transaction ids using a tool I made that I forget the name of at the moment.  Something about bsv.  It's 529.

I guess I'll want to mutate the threads, authors, etc html files.  Maybe make files sitting alongside them?  It would be nice if the whole list were archived referencing transactions.  That would take some hacking to figure out the ids before uploading, and might be unreasonable due to circular linking.  I could make a page that mutates on the fly using javascript...  hrm.

I'll just mutate the index pages for now.  It's 532.

I'll use sed to extract urls.  I'll put it in a script and find my tool once I have the urls.

Y'know, to fix the weird interdependencies,  I should mutate them a certain way.  Individual message pages can have navigation links removed.  Navigation pages can have links to navigation removed.  Then every navigation link can be a transaction id.

I'll just bite the bullet and accept that that means I'm uploading undesired data.  Maybe I can make it flexible, to use this data now and use new data if I make new data.

So, first check if a link has a mutated version, then if it has a normal version.  Pages are in categories that can be identified with functions.  Each category has some links disabled.
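Here's a sketch of that rule in python, with made-up names (the real thing ends up as sed and bash; path_to_txid stands for the path-to-transaction mapping extracted further down, and .txlink is the suffix I use for mutated copies):

# hypothetical sketch of the link rules described above; is_disabled() would
# encode the per-category choices (message pages drop navigation links,
# navigation pages drop links to other navigation pages)
def resolve_link(href, path_to_txid, is_disabled):
    if is_disabled(href):
        return None                      # this category removes the link
    mutated = href + ".txlink"           # the mutated copy of the page, if any
    for candidate in (mutated, href):    # prefer the mutated version
        if candidate in path_to_txid:
            return "https://bico.media/" + path_to_txid[candidate]
    return None                          # nothing on-chain for this link yet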

It's 546.  I'm looking up how to provide alternates to sed's grouping expressions.

556.  The index pages have urls that cross linebreaks, breaking sed.  Blargh.  This incredibly common issue should be fixed inside sed but never will be.  Hmm.  558 I found a sed hack that will work.

K.  I got the links filtered and listed with sed.  Now to find my tool.  It's in that same xloem user on github.  Bitfiles! Yay.  602 pm.

not installed on my system.  I'll try npm.

sudo npm install -g bitfiles

Hey!  A block got confirmed!  More transactions are sending.  It's 604 pm.  Maybe we can get more up by being first in the mempool.

I hope this version of bsvup has the transaction confirmation patch I added.  It's looking like it _doesn't_, so some files may need reuploading.  I can probably upload them by upgrading bsvup and changing them to pending.  I should probably do that after the upload, if I can.

Luckily, bitfiles --help outputs something, which is not often true of scripts I quickly create.  It looks like I can find a direct link to a file using something like `bitfiles status d://address/path` .

My phone is at 15% battery.  I'll plug it into this computer?  Nah all the ports are clogged with confused storage devices.  I'll grab a flaky backup battery that's charging.

Plugged.  609 pm.  Phone glitched a little.

Adding bitfiles to script.

Awww ... bitfiles crashes when run.  This usually happens quickly when I stop maintaining something.

Oh!  The error is just that the files aren't found yet; still uploading maybe.  Yay.

Yeah bsvup uploads the D:// record last ... hmm ...

I'm running it on all the links, sending stderr to /dev/null, just in case.

The unuploaded transactions are stored locally ... but I don't have an interface made to work with them that way.  And I'm not running a bsv node; it takes longer to sync than my computer can run without crashing.

It's 614.  I'll upload my half-made script to github in case I forget I'm doing this.

Uploaded.  It's 616 and I'm getting worn.

We don't actually need the D url for anything.  Only the actual transaction ids.  The transactions are stored in the .bsv folder, and bsvup likely has library functions for accessing them on some level.

The D transactions store the path information.  I believe each file has its own D transaction.  So, the way to find a file is by enumerating them all.  In nodejs, I think the bsv library can parse the transactions.  ... struggling to continue ...

It's only 619.  My arms feel soggy; it takes a lot of effort to make them obey me.

The D format is based on the bytecode within the outputs of a transaction.  It's a set sequence of data opcodes.  I most recently worked with it in python, adding reading functionality to a lib with pluggable API backends.  I'm thinking I'll add something to bitfiles rather than adding to that.

Bitfiles usually uses a bitquery server to find transactions, which I don't have, so maybe I'll just write something handmade quickly.  It's 622.

Okay.  Go through every transaction that .bsv has.  Find the D:// transactions, and identify a filepath.  Use nodejs to read the bsvup data, since I can copy code from bsvup and bitfiles.

Step 1: make a script, and enumerate the bsvup transactions.

Hey wait!  Bsvup just finished broadcasting!
It did not check for confirmations.  So some files are likely dropped.  But it caches its broadcasts.  It's 624.

I'm wondering if bitfiles has a bug where nothing is found.

It's outputting no D transactions from that address at all.  Time to clone its repo.  Hey, I already have a checkout!  It's 626.

I'll check a specific transaction hash.  In bitfiles you can check a TX with `bitfiles status TX://hash` .

It looks like the API server isn't replying.  It's a bsv API server that was deprecated by the maintainer and replaced by another.  I have a branch where I abstracted the backend out but never managed to complete that.  I wonder if I can find the new server ... maybe my branch is on this system?  I don't think I ever uploaded it but maybe I did?

I did upload it =) git worktree add wip origin/wip .

bitbus was the next server!  Let's see if I can backport the changes.  It's 633 and I have a headache.  But my arms are feeling better.

There's a funny thing in here.  After I started working with bsv, they updated the protocol so all transactions mutated.  To read both old and new data, you have to support both forms now.  It's a little confusing.  It's 636.

I copied this code from bsvup, keeping a lot of the same format because I was so confused.  It seems easier to work on things successfully when chunks are copied from or reference other things.  My phone glitches a bit.  It's 639.

I'm having difficulty continuing.  I've written the main query retriever for bitbus.  I know it doesn't work because I haven't tested it, and I always make bugs.  It's 644.

I'll add a comment that it's wip and upload it to bitfiles.  I'll also ask the maintainer of the old API server if he could give it a kick, and ask if there are other tools for checking d:// transactions.

I'm remembering that polyglot, the python code I worked on a little, might do that now.  Maybe I won't ask that guy yet.  It's 646.

I uploaded the changes.  Let's look for polyglot.

The polyglot cli tool does upload only.

The download source does not do d:// transactions yet.  They are not complicated.  The effort is comparable to the backporting attempt.

I'm getting pretty confused now.  It's 651.

I'm thinking of looking at the bsvup transactions manually to find their references.  Each one has its path as a string, and the txid.  I can grep them for the path.

I'm going to get food and water.  I have emptied my 20 oz (592 ml) water bottle and my mouth still feels dry.  It's 653 pm.

Back.  659.

I didn't remember what I was doing, but finally noticed I  was staring blankly at a description of it that I'd written.  On to grepping.

The txs are stored in hexadecimal, probably to ease dev effort sending them to an API server.  I use xxd -r -ps to decode them.

sed doesn't match binary characters with '.' :-/ .  It's 704 and I am ravenously eating this food.

The txid is terminated by a \x01 character in my example transaction.

I google the issue and learn of the trick of passing the data through `strings`.

The D transactions have an ascii identifier in them.

All the D txids are preceded by an @ sign.

I form a regex and it extracts a lot of message TX ids but none of the index page txids, so there's a bug.  It's 714 pm.

One of the characters in my regex represents the pathname link.  I change it to a '.'.

It now finds every path and txid.  I add it to the script to not lose it.
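For the record, roughly the same extraction as a python sketch (the cache glob and the regex are illustrative; the path string and the '@' before the txid are just what I observed in these transactions, not a documented format):

# rough python equivalent of the xxd/strings/grep extraction above
import binascii, glob, re

def extract_paths(cache_glob=".bsv/tx/*"):   # hypothetical cache location
    mapping = {}
    for name in glob.glob(cache_glob):
        try:
            raw = binascii.unhexlify(open(name).read().strip())
        except binascii.Error:
            continue                         # not a hex-encoded transaction
        # keep only printable ascii, like strings(1)
        text = "".join(chr(b) if 32 <= b < 127 else " " for b in raw)
        for path, txid in re.findall(r"([\w./-]+\.html)\s*@([0-9a-f]{64})", text):
            mapping[path] = txid
    return mapping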

I test one of the txids and it works.  It was randomly a post by me.  I'll qrencode it to transfer to this phone and link it here.

Blockchained cpunks post: https://bico.media/0a0539cf8670eca6d6f668dcd0c8fe475e536b3f15df7c91057b24e0de1964e7

It's 7:21 pm.

Okay.  So link mutation should be reasonable now.  No bitfiles needed.

It's 723 pm.  I'm having trouble continuing.  I finished my food.

It's 725.  I've accessed the mutation script again, and altered it to output the link list.  I need to mutate the links into the D paths I made, which means making them absolute if they aren't, and prepending the hostname.  I'm having trouble continuing.  I keep losing things in my working memory while trying to work.

I don't remember what I'm doing.  Something about bsvup.  Paths.  Mutating something.

I have to do something to mutate links.

In the text file, I have a function that produces a list of path and txid mappings.

I plan to store its output in a file, and regenerate the file when there is a miss, maybe.

Right now,  I want to make the links absolute.  The next step is to run the script, and look at the output.  I just did that.  I'm doing it again.

It's 7:29.

Relative links start with a nonslash character.  They are to be preceded with the dirname of the path, which should also be relative.
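That step is small enough to sketch in python (made-up function name; fully qualified http urls would need the scheme and hostname stripped first):

import posixpath

def absolutize(href, page_path, host="lists.cpunks.org"):
    # site-absolute links get the hostname prepended; relative links resolve
    # against the directory of the page that contains them, staying relative
    if href.startswith("/"):
        return host + href
    return posixpath.normpath(posixpath.join(posixpath.dirname(page_path), href))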

Script now finds txid for links.  739.

745.  I have the bash script mutating all the links in a pipe.  I'm not sure yet how to form a mutated output name, but it shouldn't be too hard.

751.  Figured out the output name.  I'm thinking of adding a link to the original files, in the mutated files.  Maybe in some part of the visible html.  Working on a bug in finding transformed names.

754 the bug was a logic error that I have fixed but still don't understand or remember.

759 working on a sed bug where links aren't being filtered from individual messages.  Having difficulty continuing.

The sed script is supposed to match everything.  Oh, maybe I found the error.

Nope.  But here's another: links that break across lines again.

Okay.  It's 8:34 pm and this link mutation bash hack is working.  I uploaded it to git.  It processes only a single file and is kinda slow.  The immediate suspect for slowness would be running a while loop that processes each link one by one, each one using shell commands.

I suppose if I want to upload all these mutated pages I'll have to put another $2 in the account.  First I'll look to see if I can link to the original file easily.

It seems it could work fine to insert an <i> tag at the start or end of the body, like the footer does.

8:58 .  I'm working with a confusing bug adding the link back to the original where output inside a pipe seems to disappear inside the pipe.  Maybe it would be easiest to just do it a different way.

9:11 pm.  Link-to-original seems to be working.  I may have added bugs to the script and am not sure how to detect this reliably.  While it remutates all the messages to not link to each other, since the links would change the transaction hash, I'll transfer another $2 in.

9:25 .  The mutation finished.  Some of the inter-message links are broken at the end.  I was planning on attending an event in 5 minutes but do not have access to it.  I uploaded a basic readme to the git repo I made.

9:28 The broken links are fine; they weren't supposed to work yet.  Setting the data uploading.

9:36 it wants to upload the existing files =(   I will just move them for now.

9:40 uploading from subfolder.  If I'm awake when this finishes, I want to try to remember to upgrade bsvup and check that all the transactions confirmed.  Many transactions may be nonrebroadcastable because I am doing multiple upload runs, which could respend unspent change.

944 the individual message upload is running.  I'm thinking I'll go to bed.  Maybe if I leave this email draft open, I'll resume.

950 I went to drop the backlight on the laptop and saw a block was confirmed.  Thinking all the transactions are likely to get in the mempool now.

10:00 pm. I measured the speed and it should finish before 10:10 .

The reason these tools are so crummy on my end is because I get nasty flashbacks when working on them, so I look for the quickest and easiest result.  Ideally an upload tool would be written in something more robust than nodejs and bash, and connect directly to the p2p network.  I don't know where to find something like that, and getting this far has been very difficult.  There are a lot of protocol changes that could be made to drop the price, too.

I remembered the tool that has D protocol details in it.  It was by the same guy who made bsvup ... a small inproxy.

18 transactions left.  With more attention here, the bcat protocol could be used to splice together the same webpage parts for the two different kinds of pages, so the double uploads wouldn't be needed.  That would of course need a custom uploader.

Uh-oh ... still uploading after those 18.  Guess I had the total wrong ...

Finished at 10:11.  Let's mutate those index files.

10:12 mutated with no crashing !

10:14 uploading these index files.  Done.

Let's try 'em out.

Dang.  The transactions show two thread index files that differ, neither of them saying .txlink .  I think I made an error uploading.

1017 yeah I had uploaded the same files.  Likely it is only the transaction ids that are different.

1019 reuploading

1020 the mutated indices are up but the mutation is wrong.  The message links are broken.

Debug mutation.

Looks like the same bug as before.  The sed script doesn't survive the pipe.

Missing dollar sign.  Was writing the variable name instead of its content.  1023.

Links look good locally now.  Regenerate, reupload.

It's funny to look at computer screens for me.  When things change on the screen, my eyes deconverge and I see double.  I've gotten used to it, just looking at things with one eye when it happens.  Sometimes I forget to reconverge them, since when they're staying that way it can't initiate again.

1027 verified the links locally in a browser.  Reuploading.

1029 the API server mempool ran out with one transaction remaining.  Gotta wait for next block.

1034 the indices that have already been broadcast are working.

10:36 I tried to delete the transactions from my cache that had the wrong content so that links won't be formed to them if I continue.

Let's share the links to the indices.

Now 10:45p, transferring links.

2020-September
author https://bico.media/fedd0a8368dd68ae495279094435391f0e13291866af7a8a26aa182028af2df6
date https://bico.media/bd7fb31a5d7e685fcba3892fd28a7e4f7cc35c57576e7a7812a68746e48c15f1
subject https://bico.media/4fe2cc266634e04401d27e366529b83c1f61cecf7767ab53f4b426dcc0590970
thread https://bico.media/a41c50edfa8fc0c46d0f46ae82ac8c65e9f925f5c5a731006cb421318cd524e6

https://github.com/xloem/cpunks.org-bsv-partial.git

Corrupt from phone glitches, quoted text below:

Gramps, Batshit and james DUMBFUCK are all KNOWN FASCISTS.

You rub shoulders with them at your own peril.  Kabeesh?