This is a read-only snapshot of the ComputerCraft forums,
taken in April 2020.
Downloading a GitHub release ...quickly
Started by oeed, 24 February 2015 - 06:01 AM

Posted 24 February 2015 - 07:01 AM
I've been remaking the OneOS updater and, for one reason or another, it's stupidly slow. I'm talking 5 minutes. I've got five files downloading simultaneously, and even when I increase that number it's as slow as ever.
I looked into using ZIPs/TARs, but, surprise surprise, I can't due to Lua/CC bugs. So, does anyone know a way to download all the files for a GitHub release in a single file (whether that be a base64 ZIP, a JSON structure, whatever)? I really want to avoid downloading individual files; there are about 350 so far in version 2, and the vast majority of the time spent in the updater seems to be the HTTP connection: the requests all take basically the same time regardless of file size. Downloading the zipped files takes about 5-6 seconds. I might look into hosting some PHP thing that spits out a JSON string with everything in it, but my hosting plan isn't really best friends with more than a dozen simultaneous connections.
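For what it's worth, the "single JSON structure" idea is simple to build server-side. Here is a minimal sketch in Python (rather than the PHP mentioned above; the function names are mine, and file contents are base64-encoded so binary files survive the trip through JSON):

```python
import base64
import json
import os

def bundle_directory(root):
    """Walk `root` and pack every file into one JSON-serialisable dict
    mapping relative paths to base64-encoded contents."""
    bundle = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root).replace(os.sep, "/")
            with open(path, "rb") as f:
                bundle[rel] = base64.b64encode(f.read()).decode("ascii")
    return bundle

def write_bundle(root, out_path):
    """Serialise the whole tree as a single JSON file to serve over HTTP."""
    with open(out_path, "w") as f:
        json.dump(bundle_directory(root), f)
```

The client then makes one HTTP request, parses the JSON, and writes each entry out, trading ~350 connections for one.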
Posted 24 February 2015 - 07:17 AM
Beats me what's going on with your download speeds (presumably you're attempting to pull whole files down at a time, and not line by line or something odd like that?), but you might be able to do something with this, perhaps.
Posted 24 February 2015 - 08:48 AM
Grin is really the best way to solve this problem. I don't mean to self promote, but it really does have a lot of advantages.
- It's fast
- It allows binary contents
- It's based on releases, so versioning is easy and you can separate released stuff from development stuff
- Best of all, nothing is dependent on Grin. As in, I can make an installer based on Grin in about 2 lines of code, host it on pastebin, and a user can pastebin run CODE without worrying about having Grin installed. (Example, JVML-JIT's installer).
- Also, if it's important to you, Grin lets you use the -e (or -emit-events) flag to instruct it not to print any statuses, and instead emit events for another program to pick up on and display. (Example, grin-get's install command, although honestly that has no reason to use the feature)
Edited on 24 February 2015 - 07:54 AM
Posted 24 February 2015 - 10:57 AM
> I've been remaking the OneOS updater and, for one reason or another, it's stupidly slow. I'm talking 5 minutes. I've got five files downloading simultaneously, and even when I increase that number it's as slow as ever.
> I looked into using ZIPs/TARs, but, surprise surprise, I can't due to Lua/CC bugs. So, does anyone know a way to download all the files for a GitHub release in a single file (whether that be a base64 ZIP, a JSON structure, whatever)? I really want to avoid downloading individual files; there are about 350 so far in version 2, and the vast majority of the time spent in the updater seems to be the HTTP connection: the requests all take basically the same time regardless of file size. Downloading the zipped files takes about 5-6 seconds. I might look into hosting some PHP thing that spits out a JSON string with everything in it, but my hosting plan isn't really best friends with more than a dozen simultaneous connections.
Use a packaging method; that is your best bet!
Posted 24 February 2015 - 11:14 AM
> Beats me what's going on with your download speeds (presumably you're attempting to pull whole files down at a time, and not line by line or something odd like that?), but you might be able to do something with this, perhaps.

I'll take a look at that program soon.
> Grin is really the best way to solve this problem. I don't mean to self-promote, but it really does have a lot of advantages.
> - It's fast
> - It allows binary contents
> - It's based on releases, so versioning is easy and you can separate released stuff from development stuff
> - Best of all, nothing is dependent on Grin. As in, I can make an installer based on Grin in about 2 lines of code, host it on pastebin, and a user can pastebin run CODE without worrying about having Grin installed. (Example, JVML-JIT's installer).
> - Also, if it's important to you, Grin lets you use the -e (or -emit-events) flag to instruct it not to print any statuses, and instead emit events for another program to pick up on and display. (Example, grin-get's install command, although honestly that has no reason to use the feature)

I'll also take a look at Grin. If I can, though, I'd really like to keep it a single file.

How does binary content work in Grin? The HTTP API is the issue.
Posted 24 February 2015 - 11:31 AM
It sounds like Package and Grin do much the same thing - that is to say, they each allow you to transmit a compressed archive of multiple files by making use of base64 conversion - though Grin is geared specifically towards GitHub usage, and will thus probably suit your needs better.
Package offers a built-in, um, packager (which compresses and base64's your data), whereas Grin requires you to compress and base64 the files separately using external tools. I'd consider that a minor point, however, as Grin can still unpack them on its own.
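The packaging step both tools rely on (zip, then base64) is easy to reproduce outside CC with external tools. A Python sketch of building such an archive in memory (the function name is mine; this is an illustration of the technique, not either tool's actual packager):

```python
import base64
import io
import os
import zipfile

def pack_zip_base64(root):
    """Zip everything under `root` in memory, then base64-encode the
    archive so it is plain ASCII and safe to download as text."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                # Store paths relative to the packed root
                zf.write(path, os.path.relpath(path, root))
    return base64.b64encode(buf.getvalue()).decode("ascii")
```

Base64 inflates the archive by about a third, but deflate compression usually more than pays for that on text-heavy trees.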
Posted 24 February 2015 - 11:33 AM
> It sounds like Package and Grin do much the same thing - that is to say, they each allow you to transmit a compressed archive of multiple files by making use of base64 conversion - though Grin is geared specifically towards GitHub usage, and will thus probably suit your needs better.
> Package offers a built-in, um, packager (which compresses and base64's your data), whereas Grin requires you to compress and base64 the files separately using external tools. I'd consider that a minor point, however, as Grin can still unpack them on its own.
OK, I don't have the energy to read over them at the moment, but I'd have to make a 'built' base64 file and then upload that to GitHub to release it, rather than just using the GitHub API to do stuff?
Edited on 24 February 2015 - 10:33 AM
Posted 24 February 2015 - 11:48 AM
Well, you could use the GitHub API to upload the base64 file, I suppose. Grin comes into play when it's time to do the actual downloading.
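The download side of the GitHub releases API can be sketched in Python. Given the parsed JSON that GET /repos/&lt;owner&gt;/&lt;repo&gt;/releases returns, pick a release by tag (or take the latest) and grab its first asset's download URL; the field names are GitHub's, the helper name is mine:

```python
def select_asset_url(releases, tag=None):
    """Given the parsed JSON list from GitHub's releases API, return the
    download URL of the first asset of the requested release.

    With no tag, the first entry is used (GitHub lists newest first).
    Raises StopIteration if the requested tag does not exist."""
    if tag is None:
        release = releases[0]
    else:
        release = next(r for r in releases if r["tag_name"] == tag)
    return release["assets"][0]["browser_download_url"]
```

The actual HTTP fetch is separate; this only shows how a release and its packaged asset are selected from the listing.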
Posted 24 February 2015 - 01:27 PM
> I'll also take a look at Grin. If I can, though, I'd really like to keep it a single file.
> How does binary content work in Grin? The HTTP API is the issue.
So Grin works by expecting the first asset in a GitHub release to be a .zip.base64-encoded file. The fact that it's base64 is what allows it to be downloaded successfully through the HTTP API. The release used is the most recent one, or one specified by command-line arguments. The advantage of this approach is that release builds are separated from the source repository. This means users won't be installing in-dev code, and your dev tree can contain things (like a test suite) that aren't suitable for distribution with the release, keeping the footprint on end users' computers minimal.
In short: Grin uses the GitHub API to look at a repo's release information. If a tag is specified in the command-line arguments, that release is selected; otherwise the latest release is selected. Then the first element in the selected release's assets list is downloaded and expected to be a .zip.base64-encoded file. The unnamed command-line argument is the directory the .zip file is extracted to.
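The consuming side, what happens after the asset is downloaded, amounts to base64-decode then unzip. A Python sketch of that step (Grin itself is CC Lua, where both steps have to be hand-rolled; this just shows the mechanics):

```python
import base64
import io
import zipfile

def extract_zip_base64(encoded, dest):
    """Decode a .zip.base64 payload and extract the archive into `dest`."""
    data = base64.b64decode(encoded)
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        zf.extractall(dest)
```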
EDIT: And yes, your installer will be a single file. But it would run Grin through "pastebin run VuBNx3va". The idea here is that Grin wants to be transparent to the end user. They don't even have to know Grin is involved; it just gets called, almost like a function, through pastebin.
Edited on 24 February 2015 - 12:36 PM
Posted 10 April 2015 - 02:20 PM
Use Compress. It creates a single file from the whole file structure. For example:

Compress.lua OmniOS OmniOS_15w15b

It will output a file with the name OmniOS_15w15b. Then, when you run that file like this:

OmniOS_15w15b OmniOS

it uncompresses all the files into that directory. OmniOS, which is 300 KB, gets downloaded in 5 seconds and installed in another 10. Be sure to check it out.

Pastebin:
pastebin get 1rQJ9wC7 Compress.lua
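A Compress-style "whole tree in one file" archive can be illustrated with a simple length-prefixed format: one header line per entry ("path&lt;TAB&gt;byte count"), followed by the raw bytes. This Python sketch is an invented stand-in for what Compress.lua does, not its actual format:

```python
import os

def pack(root, out_path):
    """Concatenate all files under `root` into one archive file, each
    entry preceded by a 'path<TAB>byte-count' header line."""
    with open(out_path, "wb") as out:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                rel = os.path.relpath(path, root).replace(os.sep, "/")
                with open(path, "rb") as f:
                    data = f.read()
                out.write(("%s\t%d\n" % (rel, len(data))).encode())
                out.write(data)

def unpack(archive_path, dest):
    """Recreate the packed tree under `dest` by reading each header
    line and then exactly that many bytes of content."""
    with open(archive_path, "rb") as f:
        while True:
            header = f.readline()
            if not header:
                break
            rel, size = header.decode().rstrip("\n").split("\t")
            target = os.path.join(dest, rel)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            with open(target, "wb") as out:
                out.write(f.read(int(size)))
```

Because each entry's byte count is recorded, binary content (including embedded newlines) round-trips safely; there is no compression here, which keeps the unpacker trivial, much as a CC-side extractor needs to be.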