Information about the closure of the project

Hey Folks,

We’ve decided that we’re shutting down the project at the end of the month. Please make backups of your important files; you have about two weeks to do so. Until then, the site will run without any changes.

There are several reasons for the closure:

– Since 2006 we have been on the market in an unchanged form, that is, as ad-financed/free file hosting. However, you have been visiting us less and less over the years, as the admittedly very simple formula of the service we offer has slowly been running out of steam. I guess all the competing file storage companies on the market look better, offer better performance and more features. No one needs a dinosaur like us anymore.

– All sorts of adblockers, whether built into the browser, installed as add-ons, or run as DNS services. Sure, we all use them, but they take away whatever control a site owner has over their site. Eventually a vicious cycle begins: in order to pay for the server infrastructure you are forced to place more and more ads, users then fire up more and more adblockers, and we end up at a point like today.

– Rising electricity prices. Over the past year, electricity prices have gone up 2.5 times, which, with a large number of servers, adds up to a significant increase in costs that we have no way to offset.

There are still a bunch of smaller reasons, but we could write a book on this, and probably no one would want to read it.

To sum it up, we can no longer afford to maintain the site.

You can send any comments to the address below (we’ll read them all, though we’ll probably respond to just a few):
support@zippyshare.com

Thanks for being with us over the years.

See you in the depths of the Internet. o7

Zippyuploader update, Docs and Videos encoding performance boost

Hey folks,

A quick update:

– Today a new version (0.0.16.0) of the Zippyuploader was released. The only change is an increase of the file size limit to 500MB. If something goes wrong during the update, download the Zippyuploader again from our website.

– We are thinking about releasing the source code of the Zippyuploader to the public; maybe we have a few dope C++ developers among our users who would like to add a few new functions to it. Let us know if you are interested.

– Today a new Docs/Videos encoding system has been put into production. All Docs/Videos are transcoded via a bunch of Nvidia GTX1080 graphics cards to a usable/previewable format. Transcoding is fully hardware-based, using NVDEC/CUDA (scaling via NPP)/NVENC, and its performance is up by 2000% in comparison to the previous CPU-based solution; a rough sketch of such a pipeline follows below. We have also scrapped GlusterFS (the file system we used to transfer files to the transcoding farm) and replaced it with MooseFS/LizardFS, mainly due to GlusterFS stability issues.
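For the technically curious, here is a minimal sketch of a single transcode step in such a pipeline. Treat it as an illustration only: the post doesn’t say what actually drives the cards, so this assumes an ffmpeg build with NVDEC/NPP/NVENC support, and the 1280x720 preview resolution is a made-up example.

```python
import subprocess

def transcode_preview(src: str, dst: str, width: int = 1280, height: int = 720) -> None:
    """One fully hardware-based transcode: NVDEC decode -> NPP scale -> NVENC encode."""
    # Assumes ffmpeg built with CUDA and --enable-libnpp; not necessarily our tooling.
    cmd = [
        "ffmpeg", "-y",
        "-hwaccel", "cuda",                # decode on NVDEC
        "-hwaccel_output_format", "cuda",  # keep decoded frames in GPU memory
        "-i", src,
        "-vf", f"scale_npp={width}:{height}",  # scale on the GPU via NPP
        "-c:v", "h264_nvenc",              # encode on NVENC
        dst,
    ]
    subprocess.run(cmd, check=True)

# Usage: transcode_preview("upload.mkv", "preview.mp4")
```

Keeping the frames in GPU memory between the three stages is what makes the whole thing “fully hardware-based” – copying frames back to system RAM between decode and encode would eat most of the gain.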

End of technobabble.

May the Force be with you! :-)

Edit 8.12.2017 23:59 CET:
Can we fix it? Practically every day we have to restart services around 23:00-0:00 CET to push new patches to the production environment and iron out some bugs in the docs/videos encoding farm. Unfortunately it may be a bumpy ride with us at this particular time of the day. Helmets are mandatory to prevent head injuries from sudden upload/download interruptions. Aaaand buckle up!! :-)

… and the first update of 2015 is behind us :-)

After quite a long time we finally managed to put together a meaningful update.

What has been changed?

– Link format. Links are now alphanumeric and case-sensitive (e.g. www99.zippyshare.com/v/abcdEFGH123/file.html; see the first sketch after this list).

– Html5/flash uploader. The old uploader caused some nasty issues, so we migrated to the newest version of the battle-proven plupload library.

– No CAPTCHA reCAPTCHA. There are people out there who abuse our TOS by bypassing our download website and hotlink protection in order to use our site as free cloud storage for their applications. Such misuse had a disastrous impact on the performance of our site and on the user experience. During peak hours servers were overloaded and bandwidth exhausted. We had to introduce the shiny new reCAPTCHA straight from Google under some specific circumstances. For most downloads nothing will change; most of the rest will only need to check the “I’m not a robot” checkbox, without typing in any words/numbers, to download from our site (see the second sketch below).

– Live Stats. We have rewritten the mechanism used for collecting statistics. We are still fine-tuning it, so it will take a few days until stats are up and running.
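First, the promised sketch of the new link format: roughly what parsing/validating it could look like. This is an illustration only – the 11-character ID in the example above is not necessarily the exact length we use, so the pattern below doesn’t pin the length down.

```python
import re
from typing import Optional

# New-style link: wwwNN host prefix plus an alphanumeric, case-sensitive file ID.
# The exact ID length is not fixed here; it is read off the example above.
LINK_RE = re.compile(
    r"^www\d+\.zippyshare\.com/v/(?P<file_id>[A-Za-z0-9]+)/file\.html$"
)

def parse_link(url: str) -> Optional[str]:
    """Return the case-sensitive file ID, or None if the link is malformed."""
    m = LINK_RE.match(url)
    return m.group("file_id") if m else None

assert parse_link("www99.zippyshare.com/v/abcdEFGH123/file.html") == "abcdEFGH123"
assert parse_link("www99.zippyshare.com/v/abcd EFGH/file.html") is None
```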

There were also a lot of changes “under the hood” which we hope will enable us to provide a smoother and more stable experience in the future.
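And the second sketch, back on the reCAPTCHA topic: server-side, a “No CAPTCHA reCAPTCHA” check boils down to a single call to Google’s documented siteverify endpoint. Our actual backend code obviously differs; the secret and token names below are placeholders.

```python
import requests  # third-party: pip install requests

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def captcha_passed(secret: str, response_token: str, remote_ip: str) -> bool:
    """Ask Google whether the 'I'm not a robot' box was really ticked by a human."""
    r = requests.post(
        VERIFY_URL,
        data={
            "secret": secret,            # site owner's secret key (placeholder)
            "response": response_token,  # g-recaptcha-response field from the form
            "remoteip": remote_ip,       # optional, feeds Google's risk analysis
        },
        timeout=5,
    )
    return r.json().get("success", False)
```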

Plans for the near future:

– a few bugfix updates are expected during the next few weeks.
– evaluation of our ad partners.
– adding new servers in order to raise the maximum file size.

Maintenance / Technical entry

Hey folks,

This is a maintenance/technical entry; we will update it when something goes wrong.

Edit 21.12.2013:
Sorry for the slight slowdowns on 41/42 and 43/44; we had to swap a few HDDs and run the RAID rebuild. ETR: 48h

Edit 7.06.2014:
31/32 was down for the last 24h due to problems with the RAID array. We had to swap a few HDDs (4 to be exact). Sorry for the inconvenience guys.

Edit 16.06.2014:
Another hardware clusterfuck. 55/56 has been offline for the last 48h due to a problem with the file system. We are currently migrating all the data onto a backup server. No ETR at this point.

Edit 20.06.2014 12:00 CET:
www55/www56 is up and running. No data loss. Sorry for the inconvenience guys.

Edit 15.08.2014 0:00 CET:
55/56 is once again offline. This time we had to replace 2 HDDs – RAID is rebuilding. Server should be up and ready during the weekend.

Planned works:
Migrate 69/70 and 42/43 onto backup servers due to poor I/O performance.

Edit 5.12.2014 20:00 CET:
We had some problems with our frontend cluster (you may have experienced problems with uploads, access to user accounts and public profiles, and email delivery).

The problems occurred roughly during:
23:00 CET 3.12.2014 to 12:00 CET 4.12.2014
0:00 to 1:00 CET 5.12.2014.

We are closely monitoring the whole situation. Sorry for the inconvenience folks. :-/

Quick update

We are still alive!

Some of You were concerned by the lack of new messages on our blog so we took the opportunity to give You an update. Everything is going fine. We don’t have any particularly exciting new features to brag about so we didn’t see the need to bombard You with marketing bullshit.

Here comes a small list of changes since the last blog post. I bet 99% of You didn’t notice a single one of them:

  • we cleaned up the code responsible for handling file links a little. It was seriously messed up and in dire need of a rewrite. The old code generated a few dozen different links, all of which pointed to the same file. That wasn’t a real problem, but we still needed to change it. Currently every valid link not in the form http://wwwXX.zippyshare.com/v/1234567890/file.html will be redirected to the right address (see the sketch after this list).
  • we have added additional capacity to our aggregation switches in order to prevent congestion during peak hours. Speeds should be a little better in some specific circumstances.
  • we have added SSL support to our main site (storage nodes excluded). You are free to browse Your account pages using https now.
  • we changed the hosting company for our CDN (additional POPs in Asia, South America and Europe; SSL also supported).
  • we spent quite some time streamlining the process of handling abuse/DMCA notifications.
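The sketch mentioned in the first point, for anyone curious how such a cleanup can work. The pattern and helper below are made up for illustration – our real code handles more historical link variants than this.

```python
import re
from typing import Optional

# Every old link variant carries the same host number and file ID;
# whatever trails the ID is noise we can safely drop. Illustrative pattern.
OLD_LINK_RE = re.compile(
    r"^http://www(?P<host>\d+)\.zippyshare\.com/v/(?P<file_id>\w+)(?:/.*)?$"
)

def canonical_link(url: str) -> Optional[str]:
    """Map any old-style link variant to the single canonical form (served as a 301)."""
    m = OLD_LINK_RE.match(url)
    if m is None:
        return None  # not a file link at all
    return "http://www{host}.zippyshare.com/v/{file_id}/file.html".format(**m.groupdict())

assert (canonical_link("http://www42.zippyshare.com/v/1234567890/some-old-suffix")
        == "http://www42.zippyshare.com/v/1234567890/file.html")
```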

Currently we are working on the following:

  • changing the mechanism used for communication between our servers to a more bulletproof one.
  • getting those statistics fixed! We had a major crash of our Cassandra cluster about 4 months ago. Even before the crash, the data returned from our DB resembled the output of a random number generator. It’s very unlikely that we will be able to restore the historical data (not ruling it out though). We jumped on the “Big Data” bandwagon way too early (Cassandra 0.5). It seemed like a good idea back then. A fresh start should work better in the future.

Unfortunately, in order to get things right this time, both of the changes mentioned above need to be introduced together, so it might take us a little more time.

What we are thinking about doing:

  • adding some new storage nodes and increasing both the max size and lifetime of files. No decision has been reached yet, so don’t bet on this happening any time soon.

What we are NOT working on:

  • a new layout. We do know that it is time for a major overhaul of our website. We tried to do something about it a couple of times but weren’t able to find a company eager to work on it. All of our layouts so far were created by our programmers and administrators. In case You wondered why Zippyshare is, and always was, so ugly: You now know the truth :)

Connectivity/Speed Issues

Hey Folks,

For the last 48h some of you may have encountered ugly speed problems on part of our infrastructure. We fought with this all day, and we won. Our ISP had to switch our whole infrastructure over to the backup router. Sorry for the inconvenience.

PS: Please make sure that you bomb us with a sh*tload of support tickets when such a situation happens again!!

Zippyshare technobabble once again. :-)

Hey Folks,

We owe You some information about the issues with our website You might have encountered lately. We are talking about the following:

– slow speeds on some of the servers

– long waiting time for the download to begin after the download button is clicked

– missing links after the upload process

All of this was an effect of a policy change on our hoster’s side. The change prevented us from upgrading our infrastructure, as we had to look for other offers. Since we couldn’t add hardware for some time, the load on our current servers peaked.

The good news is that we were able to strike a good deal for some decent new hardware with another hoster. We got the first batch just 2 days ago and have started migrating a few of the most loaded and failure-prone servers. If everything goes as planned, Zippyshare will be back to its usual top performance by the end of the month.

Keep Your fingers crossed and give us some time to get the job done. :-)

Edit 14.10.2012: www7/8/9 – data migration process started – ETR: 96h or more

Edit 15.10.2012: www10/11/12 – data migration process started – ETR: 96h

Edit 16.10.2012: www21/22/23/24 – data migration process started – ETR: 96h

Edit 16.10.2012: www17/18/19/20 – 107 inodes to go – 1100 already checked – ETR: 24-48h plus the time needed for data migration (we will not risk losing any more data by putting the server online when we don’t know what caused the data corruption in the first place – everything in MegaCLI looks OK)

Edit 17.10.2012: An update about 7/8/9, 21/22/23/24 – the DNS entries have been altered to point to the new machines. Not all of the data has been migrated yet, which means that some of the data on those servers will not be accessible for a while. No, we are not completely crazy, I can assure You. :-) The old servers were overloaded and down for about 50% of the time, so we decided to move some data to the new machines, redirect the traffic, and move the rest “in the background”. For the next 48h You might encounter files which have 0kb size after download. Don’t worry – the file just hasn’t been migrated from the old server yet. Just try again in a few hours’ time. We are doing our best to keep this as comfortable for You guys as we can. It will still be a bumpy ride unfortunately.

Edit 18/19.10.2012: 17/18/19/20 – up! 7/8/9, 10/11/12, 21/22/23/24 – have been successfully migrated. We got the second batch of new servers.

Edit 20.10.2012: We just started moving data from 1/2/3, 4/5/6, 13/14/15/16, 17/18/19/20, 25/26/27/28.

Edit 21.10.2012: 17/18/19/20 – has been successfully migrated.

Edit 25.10.2012: 13/14/15/16, 25/26/27/28 – have been successfully migrated.

Edit 28.10.2012: 1/2, 5/6 – have been successfully migrated.

Edit 29.10.2012: 3/4 – has been successfully migrated.

Edit 01.11.2012: Some users reported problems with file downloads (incomplete downloads, connection drops, timeouts, etc.)… we have reconfigured our servers a bit; please report any issues You might encounter.

Edit 7.11.2012: www3/4 – still running fsck (it’s not a bug, it’s a feature) :-/

Edit 10.11.2012: We got the third (and last) batch of new servers. We just started moving data from:
www29/30/31/32, 33/34/35/36, 37/38/39/40, 41/42/43/44, 63/64 – data migration only – ETR: 96h
www65/66 – the DNS entries have been altered to point to the new machines. For the next 48h You might encounter files which have 0kb size after download. Don’t worry – the file just hasn’t been migrated from the old server yet. Just try again in a few hours’ time.

Edit 12.11.2012: 65/66 – has been successfully migrated.

Edit 13.11.2012: www45/46/47/48/49/50/51/52 – has been successfully migrated.

Edit 14.11.2012: www29/30/31/32, 33/34/35/36, 37/38/39/40, 41/42/43/44, 63/64 – have been successfully migrated.

Edit 15.11.2012: Holy shit, reCaptcha everywhere… sorry folks, we are testing a theory about file downloads by some “automatic software” which causes very high load on our servers. We hope that the reCaptcha will only be a temporary solution.

Edit 20.11.2012: www3/4 – It’s Alive! 😀 No data lost, but it was a bit of a humiliating downtime for us. :/

Edit 30.11.2012: We just added another 10Gbps to one of our aggregation switches to prevent congestion in peak hours. What does this mean for you? Better download speeds, of course. :)

Edit 1.12.2012: www53/54/55/56/57 – has been successfully migrated.

Edit 9.12.2012: New servers added into load balancer, now, please give a warm welcome to www67/68/69/70/71/72/73/74… more to come. 😉

Edit 26.01.2013: New server added into load balancer, please give a warm welcome to www75/76