Multi-threaded downloader

Ok so I’m making a better downloader for RSPS caches. What I’m currently working on is a concurrentDownloader that uses threads to download file segments and assemble them afterwards, but I’ve run into a bit of a snag: the downloader in the client used HTTP connections, which allowed it to monitor what was coming in and therefore provide a download speed calculation and percentage. Unfortunately, to download a file in segments you need to grab the file size from the HTTP header and divide it by the number of segments you want. This creates a problem, because by then the HTTP connection is already open and you can’t add properties to the header, even if you close and open the connection again. My specific problem is the “Range:” attribute, which I need to send to be able to download only a certain part of the stream…

Any suggestions?

Oh! and this is client side only there is no accompanying server.

Edit: Well, that problem was solved (roughly the idea sketched below), but as this is still a work in progress I suppose I’ll keep asking whenever I get stuck, so on to my next problem…
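For the record, the general shape of the fix (a rough sketch only, not the exact code: the URL and file name below are placeholders, and it assumes the host honours HTTP Range requests): read Content-Length once, then open a fresh connection per segment with the Range header set before the stream is opened, and let each worker write its chunk to its own offset in the output file.

```java
import java.io.InputStream;
import java.io.RandomAccessFile;
import java.net.HttpURLConnection;
import java.net.URL;

public class SegmentedDownload {

    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/cache/main_file_cache.dat"); // placeholder
        String output = "main_file_cache.dat";                              // placeholder
        int segments = 4;

        // One connection just to read the file size from the HTTP header.
        HttpURLConnection head = (HttpURLConnection) url.openConnection();
        head.setRequestMethod("HEAD");
        long totalSize = head.getContentLengthLong();
        head.disconnect();

        // Pre-size the output file so every worker can write at its own offset.
        try (RandomAccessFile out = new RandomAccessFile(output, "rw")) {
            out.setLength(totalSize);
        }

        long chunk = totalSize / segments;
        Thread[] workers = new Thread[segments];
        for (int i = 0; i < segments; i++) {
            final long start = i * chunk;
            final long end = (i == segments - 1) ? totalSize - 1 : start + chunk - 1;
            workers[i] = new Thread(() -> downloadRange(url, output, start, end));
            workers[i].start();
        }
        for (Thread worker : workers) {
            worker.join();
        }
    }

    private static void downloadRange(URL url, String output, long start, long end) {
        try {
            // A brand-new connection per segment: request properties such as
            // "Range" have to be set before the connection is opened, i.e.
            // before getInputStream() is called.
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestProperty("Range", "bytes=" + start + "-" + end);

            try (InputStream in = conn.getInputStream();
                 RandomAccessFile out = new RandomAccessFile(output, "rw")) {
                out.seek(start);
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```

The speed/percentage monitoring still works with this layout: each worker can add the bytes it reads to a shared counter, and since the total size is already known from the first request, the percentage falls out of that.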

So… I noticed my buffered reader allows the connection to die for x seconds before it throws this error: SSL peer shut down incorrectly

Anyone know how to set the timeout for this?

Edit: So everything now seems to be working pretty well, except for one thing: I’m having trouble with variable access…
My threads are overwriting each other’s variables and I don’t understand why; everything is instanced and I’m even using local method variables. Any ideas?

I’m not sure you can add a timeout

The bottleneck is the client’s download speed and the server’s upload. This is useless. Jaggrab was designed to request the needed files in the right order.

[quote=“Roar337, post:1, topic:554987”]So… I noticed my buffered reader allows the connection to die for x seconds before it throws this error: SSL peer shut down incorrectly

Anyone know how to set the timeout for this?[/quote]
If you haven’t solved that already, it’s the read timeout.
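If it helps, on a plain HttpURLConnection the relevant settings look roughly like this (the URL is a placeholder); a stalled read then throws a SocketTimeoutException instead of hanging until the peer drops the connection:

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class TimeoutExample {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/cache/main_file_cache.dat"); // placeholder

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(10_000); // give up if the host can't be reached in 10s
        conn.setReadTimeout(15_000);    // throw SocketTimeoutException if no data
                                        // arrives for 15s on an open connection

        try (InputStream in = conn.getInputStream()) {
            byte[] buffer = new byte[8192];
            while (in.read(buffer) != -1) {
                // read as usual; a dead connection now fails fast
            }
        }
    }
}
```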

[quote=“Roar337, post:1, topic:554987”]So everything now seems to be working pretty well, except for one thing: I’m having trouble with variable access…
My threads are overwriting each other’s variables and I don’t understand why; everything is instanced and I’m even using local method variables. Any ideas?[/quote]
we would have to see the code… it’s impossible for a modification of a variable declared inside a method to interfere with other invocations of that method.
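One common culprit, though (purely a guess without seeing the code): per-segment values kept in shared instance fields rather than in locals captured by each worker. Something like this reproduces the symptom; all class, method, and field names here are made up for the example.

```java
public class SharedStateExample {

    // Shared by every worker thread -- each loop iteration stomps on these.
    private long start;
    private long end;

    // Buggy: the lambda reads the fields when the thread RUNS, not when it
    // was created, so most workers see values from the last iteration.
    public void startWorkersBuggy(int segments, long totalSize) {
        long chunk = totalSize / segments;
        for (int i = 0; i < segments; i++) {
            start = i * chunk;
            end = start + chunk - 1;
            new Thread(() -> System.out.println("buggy range " + start + "-" + end)).start();
        }
    }

    // Fixed: each worker captures its own effectively-final locals, so the
    // values are frozen per iteration and nothing can overwrite them.
    public void startWorkersFixed(int segments, long totalSize) {
        long chunk = totalSize / segments;
        for (int i = 0; i < segments; i++) {
            final long from = i * chunk;
            final long to = from + chunk - 1;
            new Thread(() -> System.out.println("fixed range " + from + "-" + to)).start();
        }
    }

    public static void main(String[] args) {
        new SharedStateExample().startWorkersBuggy(4, 4000);
        new SharedStateExample().startWorkersFixed(4, 4000);
    }
}
```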

[quote=“Justin Bieber, post:4, topic:554987”][quote author=Roar337 link=topic=673912.msg4505758#msg4505758 date=1458082867]
So… I noticed my buffered reader allows the connection to die for x seconds before it throws this error: SSL peer shut down incorrectly

Anyone know how to set the timeout for this?
[/quote]
If you haven’t solved that already, it’s the read timeout.

we would have to see the code… it’s impossible for a modification of a variable declared inside a method to interfere with other invocations of that method.[/quote]

^ this guy is just trying to steal your codes, don’t post them.

silab is a full-time troll, ignore everything she posts

[quote=“lare69, post:6, topic:554987”][quote author=Miss Silabsoft link=topic=673912.msg4505848#msg4505848 date=1458303654]
^ this guy is just trying to steal your codes, don’t post them.
[/quote]
silab is a full-time troll, ignore everything she posts[/quote]

When you read her posts your IQ drops

I am a mod and I can confirm what silab said: do NOT post your code to Justin, he is a KNOWN LEECHER. He steals and then tries to sell the code on rune-server.

Jaggrab is flawed.

Edit: I finished my downloader. Unfortunately it’s lacking the feature to resume, but it’s capable of performing a cache update almost instantaneously by using the previously discussed methods.

Also, my jaggrab (I’m not sure if they all do this or not) only works on main idx files, making it super slow, and it also requires a server. This downloader, however, is entirely client-sided, so it won’t continuously open massive download connections on your VPS.

protip: create a botnet, run bittorrent instances on botnet serving up your fresh cache files, simple.

[quote=“Roar337, post:9, topic:554987”][quote author=Davidi2 link=topic=673912.msg4505814#msg4505814 date=1458233899]
The bottleneck is the client’s download speed and the server’s upload. This is useless. Jaggrab was designed to request the needed files in the right order.
[/quote]

Jaggrab is flawed.

Edit: I finished my downloader. Unfortunately it’s lacking the feature to resume, but it’s capable of performing a cache update almost instantaneously by using the previously discussed methods.

Also, my jaggrab (I’m not sure if they all do this or not) only works on main idx files, making it super slow, and it also requires a server. This downloader, however, is entirely client-sided, so it won’t continuously open massive download connections on your VPS.[/quote]
Uh, jaggrab is designed for the “main idx files” because those are the cache files…
The rest of the data is supposed to be packed into them… They’re not because people couldn’t figure out how to do it properly (or broke their cache in the process) so they loaded them externally.

[quote=“Roar337, post:9, topic:554987”][quote author=Davidi2 link=topic=673912.msg4505814#msg4505814 date=1458233899]
The bottleneck is the client’s download speed and the server’s upload. This is useless. Jaggrab was designed to request the needed files in the right order.
[/quote]

Jaggrab is flawed.

Edit: I finished my downloader. Unfortunately it’s lacking the feature to resume, but it’s capable of performing a cache update almost instantaneously by using the previously discussed methods.

Also, my jaggrab (I’m not sure if they all do this or not) only works on main idx files, making it super slow, and it also requires a server. This downloader, however, is entirely client-sided, so it won’t continuously open massive download connections on your VPS.[/quote]The cache is hosted somewhere, whether that’s on your ‘server’ VPS or on a different hosting site like dropbox or whatever; there’s no such thing as a ‘100% client-sided downloader’. Jaggrab is DESIGNED to request the cache files that are needed, saving you bandwidth in either case.

“Cache update almost instantly”. Yea, I bet you’re “updating” from files on your local PC, rofl.

@Sk8r This is so bad!!! Why would you want to download 500mb worth of main cache files when you can preload models, download the folder and all the models, and put the existing cache and the download together?

Downloading a full cache is a 30m-1hr process; the process I described finishes in mere seconds. I’m aware that smaller files take longer to download when there is a large quantity of them, and I thought of the solution of just creating new idx files once it came to that point, if it ever did.

@Davidi you’re mistaken, I shouldn’t say this is 100% client-sided; the files are hosted on a website, however there is nothing to do there but upload them somewhere. The downloader tracks all the files, their names and their sizes, and besides, the website does not require a server sitting on a VPS.

Also @Davidi, it’s true jaggrab does request only the NEEDED files; the problem here is that it deals with unnecessarily large files. Say you add a model with a size of 1kb to the main idx file: jaggrab will end up downloading the whole idx, because it is a packed file, the whole idx being between 200-500mb in size, for a single 1kb update which should be processed instantly.

If one desired to keep everything packed, perhaps the models should be downloaded from somewhere as needed AND THEN packed by the client.
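Roughly what the name/size tracking could look like, for the curious (a sketch only: the host and file names are placeholders, and comparing sizes alone will miss a change that keeps the size identical, so a checksum would be safer):

```java
import java.io.File;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Arrays;
import java.util.List;

public class UpdateCheck {

    public static void main(String[] args) throws Exception {
        String baseUrl = "https://example.com/cache/";        // placeholder host
        List<String> files = Arrays.asList("model_1234.dat",  // placeholder file names
                                           "model_5678.dat");

        for (String name : files) {
            long remoteSize = remoteSize(new URL(baseUrl + name));
            File local = new File("cache/" + name);

            if (!local.exists() || local.length() != remoteSize) {
                System.out.println(name + " is new or changed -- download it");
                // hand the file (or just the changed segment) to the downloader here
            } else {
                System.out.println(name + " is up to date -- skip it");
            }
        }
    }

    // One HEAD request per file, only to read its Content-Length.
    private static long remoteSize(URL url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("HEAD");
        long size = conn.getContentLengthLong();
        conn.disconnect();
        return size;
    }
}
```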

[quote=“Roar337, post:13, topic:554987”]@Sk8r This is so bad!!! Why would you want to download 500mb worth of main cache files when you can preload models, download the folder and all the models, and put the existing cache and the download together?

Downloading a full cache is a 30m-1hr process; the process I described finishes in mere seconds. I’m aware that smaller files take longer to download when there is a large quantity of them, and I thought of the solution of just creating new idx files once it came to that point, if it ever did.

@Davidi you’re mistaken, I shouldn’t say this is 100% client-sided; the files are hosted on a website, however there is nothing to do there but upload them somewhere. The downloader tracks all the files, their names and their sizes, and besides, the website does not require a server sitting on a VPS.

Also @Davidi, it’s true jaggrab does request only the NEEDED files; the problem here is that it deals with unnecessarily large files. Say you add a model with a size of 1kb to the main idx file: jaggrab will end up downloading the whole idx, because it is a packed file, the whole idx being between 200-500mb in size, for a single 1kb update which should be processed instantly.

If one desired to keep everything packed, perhaps the models should be downloaded from somewhere as needed AND THEN packed by the client.[/quote]
you have absolutely no idea how the client updater works and should probably reevaluate the code, dipshit.

you stick beanie babies into a water pipe
you get 5 people to stick beanie babies into a water pipe
but the water pipe can only take so many beanie babies even if you have 5 people shoving stuffed animals into the pipe
the pipe is your mom’s butt
same concept
it doesn’t make it faster

The compressed cache files you create would be smaller than the “models and existing cache”.

wtf, a website still runs on a server

[quote=“sini, post:15, topic:554987”]you stick beanie babies into a water pipe
you get 5 people to stick beanie babies into a water pipe
but the water pipe can only take so many beanie babies even if you have 5 people shoving stuffed animals into the pipe
the pipe is your mom’s butt
same concept
it doesn’t make it faster[/quote]
someone give this man some gold

[quote=“Justin Bieber, post:17, topic:554987”][quote author=Roar337 link=topic=673912.msg4505977#msg4505977 date=1458654084]
@Davidi you’re mistaken, I shouldn’t say this is 100% client-sided; the files are hosted on a website, however there is nothing to do there but upload them somewhere. The downloader tracks all the files, their names and their sizes, and besides, the website does not require a server sitting on a VPS.
[/quote]
wtf, a website still runs on a server[/quote]

Jesus…

silab is right, justin bieber is a CODE SWIPER