JoWie Posted December 5, 2009

> You should really improve the sequential file ID stuff; there is a Firefox extension which does the exact same thing as that script, and it takes like 10 seconds to set up.
> The problem wasn't the sequential IDs... it was downloading each file with a connection, which kind of killed the server.

Which is really easy to get access to; at my school I usually download files at 10 MiB/s.
L.C. Posted December 5, 2009 (edited)

> You should really improve the sequential file ID stuff; there is a Firefox extension which does the exact same thing as that script, and it takes like 10 seconds to set up.
> The problem wasn't the sequential IDs... it was downloading each file with a connection, which kind of killed the server.
> Which is really easy to get access to; at my school I usually download files at 10 MiB/s.

At my workplace we have 100Mbit fiber.

The problem was also this: although the script's download function did wait for each file to finish before moving on, the assessment was that because roughly 30% of the database consisted of approximately 1300 of the files, and there was no wait timer on the script, too many PHP requests were made in a very short amount of time. Combine a connection with a lot of downstream bandwidth, a script with no wait time between requests, and hundreds of very small files (relative to that connection), and the server gets hammered in short order simply from the number of requests being made per second.

Edited December 5, 2009 by L.C.
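The missing piece the posts above describe is simply a pause between requests. Below is a minimal sketch of what a throttled bulk downloader looks like, assuming sequential numeric IDs; the URL pattern, file naming, and two-second delay are illustrative and are not SSDL's actual interface:

```python
import time
import urllib.request

BASE_URL = "http://example.invalid/download.php?id={}"  # hypothetical URL pattern
DELAY_SECONDS = 2.0  # the wait timer the original script lacked

def fetch_range(first_id, last_id):
    """Download a range of sequential file IDs, pausing between requests."""
    for file_id in range(first_id, last_id + 1):
        url = BASE_URL.format(file_id)
        try:
            with urllib.request.urlopen(url) as response:
                data = response.read()
            with open("file_{}.bin".format(file_id), "wb") as out:
                out.write(data)
        except Exception as exc:
            print("skipping id {}: {}".format(file_id, exc))
        time.sleep(DELAY_SECONDS)  # spread the requests out instead of firing them back to back

if __name__ == "__main__":
    fetch_range(1, 1300)
```

Without the `time.sleep` call this is essentially the scenario described above: hundreds of small requests hitting the PHP front end every minute.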
Dr Brain Posted December 5, 2009

The point was that ANYONE can do that with basically no work. Testtube needs to change it if he doesn't want it to happen again.
krslynx Posted December 5, 2009

Bandwidth wasn't the issue. Cerium covered that, and CRe's connection isn't enough to bring a 10 Mbps uplink to its knees (even if CRe has a decent ISP, his connection will have a HUGE share ratio), even though TT's connection is serving 69 other websites. What brought the server to a slow-down was the PHP requests, as L.C. quoted me on earlier, over 1300 of them. Although it could be prevented in future with perhaps a few lines of code, really SSDL should be redesigned; it's just a question of who'd actually redesign it.
»Xog Posted December 5, 2009

> Bandwidth wasn't the issue. Cerium covered that, and CRe's connection isn't enough to bring a 10 Mbps uplink to its knees (even if CRe has a decent ISP, his connection will have a HUGE share ratio), even though TT's connection is serving 69 other websites. What brought the server to a slow-down was the PHP requests, as L.C. quoted me on earlier, over 1300 of them. Although it could be prevented in future with perhaps a few lines of code, really SSDL should be redesigned; it's just a question of who'd actually redesign it.

You mean like creating randomly generated reference IDs for each download, so it isn't as easily accessible by doing "download linkid=1, 2, 3, etc."? Like, for file 1, a randomly generated ref. ID could be something like WLa9G31, something a bit more difficult to search for, so a script couldn't just grab all 1300 download links and d/l them like that? Too bad I have little to no experience with servers/coding, so I have no idea how that would be implemented... but I'm sure there's some type of program that can be made to create randomly generated link IDs for each file.
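The idea Xog describes, opaque download tokens instead of guessable sequential IDs, is straightforward to sketch. This is an assumed, minimal version: the token length and the in-memory table are illustrative, and a real site would presumably store the mapping in its database:

```python
import secrets

# Hypothetical mapping of opaque download tokens to real file IDs.
tokens = {}

def make_token(file_id, length=8):
    """Generate an unguessable reference ID (WLa9G31-style) for a file."""
    token = secrets.token_urlsafe(16)[:length]
    tokens[token] = file_id
    return token

def resolve_token(token):
    """Look up the real file ID; unknown or guessed tokens simply miss."""
    return tokens.get(token)

if __name__ == "__main__":
    ref = make_token(1)
    print("download link: /download?ref=" + ref)
    print("resolves to file id:", resolve_token(ref))
```

Enumerating "ref=1, 2, 3" no longer works, although on its own this only hides the links; it does not stop someone who already has them from requesting all 1300 at once, which is why the rate limiting discussed below still matters.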
»Ceiu Posted December 5, 2009

> Bandwidth wasn't the issue. Cerium covered that, and CRe's connection isn't enough to bring a 10 Mbps uplink to its knees (even if CRe has a decent ISP, his connection will have a HUGE share ratio), even though TT's connection is serving 69 other websites. What brought the server to a slow-down was the PHP requests, as L.C. quoted me on earlier, over 1300 of them. Although it could be prevented in future with perhaps a few lines of code, really SSDL should be redesigned; it's just a question of who'd actually redesign it.

I call BS on that too. Unless the site is designed in a horrific way, that's just not a possibility. My server, sporting a 1.2GHz Athlon Thunderbird, would laugh at that.

Regardless, the excuses that "it hasn't been a problem before" or "they shouldn't have done it" are stupid. You can't expect usage trends to remain constant, nor can you expect users to know or care about your server setup. As JoWie pointed out, it's trivial for even non-devs to set up something like they did; it was simply a matter of time until someone actually had a reason to do it.

The entire "problem" stems from poor server management. Each website could have been given its own resource slice to prevent this (via more virtualization software), Apache modules could have been put in place to make sure a single end-user isn't hammering him with requests or bandwidth usage, SSDL could have been designed to only allow X downloads over Y time intervals, etc. There are so goddamn many ways this could have been prevented, most of which can still be put in place to stop it from happening again, yet Testtube would rather refuse to take responsibility and do nothing, which is fucking irritating to see.

Even more annoying is that he's blatantly lying about the extent of the damage. You can't tell me that the site was down for "several hours" due to PHP requests hammering the server, yet also claim that your bandwidth usage was above "the 95th percentile" at the same time -- the math simply doesn't work. If he was eating that much of your bandwidth, it wouldn't have taken several hours to download 2GB (it'd be roughly 30-45 minutes). If he was pegging your processor, then you wouldn't have had the processing time to even send that much data. But hey, who needs to be accurate when you can play the role of the martyr because you're throwing $60/mo at this game, right?

What really got to me in this whole situation is that lynx was banned for doing nothing (again, lulz) short of disagreeing with someone who's part of the Ol' Boys club. Clearly that's grounds for removal. It's this elitist "we can do no wrong" attitude which is doing a disservice to this game and the human race. Do the rest of us a favor and either grow up or drop dead.

tl;dr: TT, Swift and anyone else in similar positions: if you don't want to run things properly, then don't get all pissy when someone comes along who's willing to attempt to do it better.
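One of the fixes Ceiu lists, "only allow X downloads over Y time intervals", really is only a few lines of code. A minimal sketch of a per-client rate limiter follows, assuming the history is kept in memory and keyed by client address; a real deployment would more likely enforce this in the database or at the Apache layer:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # Y: length of the time window
MAX_DOWNLOADS = 10    # X: downloads allowed per client within that window

_recent = defaultdict(deque)  # client address -> timestamps of recent downloads

def allow_download(client_addr):
    """Return True if this client may start another download right now."""
    now = time.monotonic()
    history = _recent[client_addr]
    # Discard timestamps that have fallen outside the window.
    while history and now - history[0] > WINDOW_SECONDS:
        history.popleft()
    if len(history) >= MAX_DOWNLOADS:
        return False  # over the limit; serve an error page instead of the file
    history.append(now)
    return True

if __name__ == "__main__":
    for i in range(12):
        print(i, allow_download("203.0.113.7"))  # the last two calls are refused
```

Checking this before handing out a file caps any single client at X downloads per window, regardless of how fast their connection is.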
L.C. Posted December 5, 2009 (edited)

There is nothing stopping anyone from creating their own SSDL webportal software on their own server resources.

Edited December 5, 2009 by L.C.
»doc flabby Posted December 5, 2009

I'd be happy to design a new download site....
Sketter Posted December 6, 2009

Let's get this over with. Mods, please lock this. The drama has happened, and ended. Everyone is happy (methinks), and despite good old SS drama (it actually was good and entertaining), it's all over.

Cheers to a good read,
Sketter
aquarius Posted December 6, 2009

I'm pretty sure a lot of us are waiting for TT to provide the files.
»freakmonger Posted December 6, 2009

I personally would like to see a new/better design at SSDL. I know TT and I have talked about it in the past, and he's tried to find somebody to help out, but to my knowledge nobody has rogered up. I've been looking for commercial-type software for mega file hosting but haven't found anything like what we need yet.
krslynx Posted December 6, 2009 (edited)

> I'd be happy to design a new download site....

^^ - There's your designer. To be honest, with RubyOnRails I can't see it being hard at all to redesign SSDL from a performance perspective, while also having somebody who's into making things look pretty design all of the CSS/JS/images for the site.

Edited December 6, 2009 by krslynx
»doc flabby Posted December 6, 2009

> > I'd be happy to design a new download site....
> ^^ - There's your designer. To be honest, with RubyOnRails I can't see it being hard at all to redesign SSDL from a performance perspective, while also having somebody who's into making things look pretty design all of the CSS/JS/images for the site.

If someone can design the CSS/JS/images for the site to make it look pretty, I'd be happy to code it (or adapt an existing open source project). I could probably integrate the login details with ssforum using the API, which would be ideal: a single login for all SS sites.
Hakaku Posted December 6, 2009

If you want a new design that urgently, try dropping larrythehamster a note.
L.C. Posted December 6, 2009

Perhaps the template of subspaceonlines.com could be used? It would be one way to make everything look the same and unified.

Testtube has uploaded the ZIP archive, weighing in at 2.45GB. I will have it uploaded and ready for download within a few hours.
»D1st0rt Posted December 6, 2009

Seriously, guys? Seriously? I mean, come on... Seriously?
Trained Posted December 11, 2009

Fucking amazing... I can't believe I missed out... so, so sad.
L.C. Posted December 11, 2009

>> New thread here << (For those that did not notice.)