
/tech/ - technology



File: 1492804165135.jpeg (38.31 KB, 698x400, game.jpeg)

 No.68

In this thread we post websites that host useful or otherwise interesting resources but don't directly allow downloading them, and look for ways to get the data for personal storage and sharing.
As this might be considered a wargame of sorts, I think we should use spoiler tags around the solutions when we post them.

Disclaimer that goes without saying: no personal /r/ing, doxxing, or stuff like that. Let's keep it interesting.
I don't know whether we should upload the data ITT or just post the answer so that lainons can download it on their own. What do you think? Also, does anybody know this site's maximum file upload size?

 No.71

File: 1492807118531.jpg (151.89 KB, 500x448, netlabels.jpg)

Here's one challenge.

http://www.actsofsilence.com/netlabels/ hosts a big list of Creative Commons netlabels, i.e. sites that publish music that's free to download and share. Actually downloading it all is difficult, though, because it's spread across around 300 different publishers that use different hosting methods for the music (bandcamp, archive.org, direct download, etc.). I think what this would take is:

1. For each netlabel, find a way to automate the processes of:
1.a. Downloading all the music
1.b. Tagging the music files appropriately.
1.c. Checking for new publications/uploaded music that you don't have locally.
2. Write a script that, when run, creates a directory named after the relevant netlabel and performs those three actions there.
3. Write a script that, when run, takes a bunch of those netlabel-specific scripts and runs them all inside a given directory, building one encompassing library.
4. For as long as the internet exists: keep expanding the program by adding more CC netlabels to it.

I'm going to loop through step 1 myself; other iterators should feel free to join me, pick a netlabel from the list, and look under its hood. Happy hacking!
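
For steps 2 and 3, something like this could work as a starting point. It's only a sketch; the netlabels.d/ layout and the script names are my own invention and not part of the plan above, and each per-label script is assumed to do 1.a-1.c on its own when run inside its directory.

#!/bin/sh
# build-library.sh - step 3: run every netlabel-specific script (step 2)
# inside a common library directory. One executable script per netlabel
# is expected in ./netlabels.d/ (assumed layout, adjust to taste).
set -eu

LIBRARY="${1:-$HOME/netlabel-library}"
SCRIPTS_DIR="$(pwd)/netlabels.d"

mkdir -p "$LIBRARY"

for script in "$SCRIPTS_DIR"/*.sh; do
    [ -e "$script" ] || continue             # no scripts yet, nothing to do
    name="$(basename "$script" .sh)"         # e.g. abstrakt-reflections
    mkdir -p "$LIBRARY/$name"                # step 2: directory named after the label
    ( cd "$LIBRARY/$name" && sh "$script" )  # script does 1.a-1.c in that directory
done

Re-running build-library.sh then doubles as the step 1.c check, as long as each per-label script skips files it already has.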

 No.72

>>71
I went after some of the netlabels that offer direct downloads. However, I hadn't realised that Acts Of Silence keeps an .opml file with the RSS links for the netlabels that have one available, so the downloading part is solved for the ones listed with an RSS feed, as long as the feed covers all releases and not just the latest ones.
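
For the ones with a feed, something like this should pull the feed URLs out of the OPML. It assumes the file has already been saved locally as netlabels.opml (made-up filename); OPML keeps each feed in an xmlUrl attribute, so a crude grep is enough here.

#!/bin/sh
# Extract every feed URL from the OPML and dump them one per line.
grep -o 'xmlUrl="[^"]*"' netlabels.opml \
    | sed 's/^xmlUrl="//; s/"$//' \
    > feeds.txt
# feeds.txt can then be handed to whatever RSS reader/downloader you prefer.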

In Abstrakt Reflections the URLs for the .zip albums look like http://www.abstraktreflections.net/downloadN, where N is a number from 1 to 174 at the time of writing. Throwing those URLs into a text file and feeding it to wget or aria2 will do 1.a; note that it downloads each album in all two or three available audio formats. Their music is well tagged, so they took care of 1.b themselves. Iterating over N and stopping at their "Download does not exist!" page would do 1.c.
AlchEmistica is similar: http://www.alchemistica.net/AUDIO/alfdNNN.zip, where iterating NNN from 001 to 009 gets you the files. Everything else is like the previous label.
Altema is just the same; URLs look like http://www.altemarecords.jp/release/altm_NNN/data/altm-NNN.zip.
Buddhistonfire is similar; the links are of the form http://www.archive.org/download/bofNNN/bofNNN_vbr_mp3.zip, except for N=004, whose URL is http://www.archive.org/download/bof004a/bof004a_vbr_mp3.zip.
Cryoworks: http://www.cryoworks.com/Releases/owoNNN.zip
CYAN: http://www.cyan-music.com/releases/CYAN-NNN-FLAC.zip
rain: http://www.archive.org/download/rain059/rain059.zip
Resting Bell: http://raw.media.sonicsquirrel.net/restingbell/rbNNN/rbNNN.zip
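
To turn those patterns into actual downloads, here's a sketch for Abstrakt Reflections; the same shape works for the other labels by swapping the printf format (use %03d for the zero-padded ones). The wget flags are my best guess at getting sensible filenames and skipping things already on disk, so adjust as needed.

#!/bin/sh
# 1.a: generate the numbered URLs (1 to 174 as of this post) and fetch them.
seq 1 174 | while read -r n; do
    printf 'http://www.abstraktreflections.net/download%d\n' "$n"
done > urls.txt

# --content-disposition asks the server for its real filename,
# --no-clobber skips anything already downloaded (a cheap stand-in for 1.c).
wget --content-disposition --no-clobber -i urls.txt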

 No.86

>>71
If they are on archive.org, these will solve the downloading:
https://blog.archive.org/2012/04/26/downloading-in-bulk-using-wget/
https://internetarchive.readthedocs.io/en/latest/cli.html#download

However, I can't get it to search by user. I've found Otherman Records' account (@corekrave), but searching for it does not return all of their uploads.
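
For reference, here's roughly what I tried with the ia tool from the second link. The creator query is a guess at how Otherman Records tag their items (archive.org metadata varies between creator and uploader per item), which is probably why the results are incomplete.

#!/bin/sh
# Search archive.org, keep only the item identifiers, then bulk-download
# the VBR MP3 zips from each matching item.
ia search 'creator:(Otherman Records)' --itemlist > items.txt

while read -r item; do
    ia download "$item" --glob='*_vbr_mp3.zip'
done < items.txt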

 No.1091

Brap.org

 No.1092

>>71
youtube-dl will download most stuff you give it; it should handle direct links, archive.org, and bandcamp.
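
For example, something along these lines; the batch file name and the output template are just one possible layout, not something I've tested against every label.

#!/bin/sh
# Feed youtube-dl a text file of album/page URLs (one per line) and let it
# sort the results into uploader/album-style directories.
youtube-dl \
    --batch-file urls.txt \
    --output '%(uploader)s/%(playlist)s/%(title)s.%(ext)s' \
    --ignore-errors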


