Goddamn hope this story gets somebody at google’s attention. Off topic, even though it was mentioned in the article, what ended up happening to the dad’s account, was it reinstated? I can’t find an update
Maybe they’ll help him retrieve the data. Presumably the servers haven’t been used for something else yet. Then again maybe not. When you control how most people get their news who cares if one reporter gets mad?
It isn't just one reporter who is going to have an issue with this.
from “don’t be evil” to stunts like this in basically no time flat. #capitalism!
How is Google getting rich off his free account?
Ok, so I think the timeline is: he signed up for an unlimited storage plan. Over several years, he uploaded 233TB of video to Google's storage. They discontinued the unlimited storage plan he was using, and that plan ended May 11th. They gave him a "60 day grace period" ending on July 10th, after which his account was converted to a read-only mode.
He figured the data was safe, and continued using the storage he now isn’t really paying for from July 10th until December 12th. On December 12th, Google tells him they’re going to delete his account in a week, which isn’t enough time to retrieve his data… because he didn’t do anything during the period before his plan ended, didn’t do anything during the grace period, and hasn’t done anything since the grace period ended.
I get that they should have given him more than a week of warning before moving to delete, but I’m not exactly sure what he was expecting. Storing files is an ongoing expense, and he’s not paying that cost anymore.
Exactly. People love to "cry foul" when Google does stuff like this, but it's completely unrealistic to think you can store 233 TB on Google's servers in perpetuity just because you're giving them like $20-30/month (probably less; I had signed up for Google for Business to get the unlimited storage as well, and IIRC it was like $5-$10/month). It was known a while ago that people were abusing the hell out of this loophole to make huge cloud media servers.
He's an idiot for saving "his life's work" in one place that he doesn't control. If he really cares about it that much he should have had cold-storage backups of it all. Once you get beyond like 10-20 TB it's time to look into a home server or put one in a colo. Granted, storing hundreds of TBs isn't cheap (I had 187 TB in my server across like 20 drives), but it gives you peace of mind to know that you control access to it.
I have all my “important” stuff in Google drive even though I run my own media server with like 100 TBs but that’s because I tend to break stuff unintentionally or don’t want to have to worry about deleting it accidentally. All my important stuff amounts to 33 GB. That’s a drop in the ocean for Google. Most of that is also stored either on my server, the server I built for my parents, or pictures stored on Facebook.
Wait, journalist, 233 terabytes? Just what in the fuck did his life's work consist of?
It’s simply stupid to not compress to h265 before uploading it.
For people authoring original content who may end up having the only copy of a given piece of news-relevant data in their possession, using a lossy compression method to back it up sort of defeats the purpose. This isn’t stashing your old DVD collection, this is trying to back up privileged professional data.
https://x265.readthedocs.io/en/master/lossless.html
https://trac.ffmpeg.org/wiki/Encode/H.265#Losslessencoding
I want to clarify that it supports lossless compression as well.
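For anyone who hasn't done this before, here's a minimal sketch of what a lossless x265 encode looks like with ffmpeg, per the wiki page linked above. Filenames are placeholders, and you'd want to tune the preset for your hardware:

```shell
# Lossless H.265/HEVC encode with ffmpeg + libx265 (input/output names are placeholders).
# lossless=1 turns on x265's true lossless mode: no quality is discarded, ever.
# -c:a copy passes the audio through untouched.
ffmpeg -i input.mov -c:v libx265 -x265-params lossless=1 -preset medium -c:a copy output.mkv
```

Just keep your expectations in check: lossless video compression typically shaves off a modest fraction of the file size, not the order-of-magnitude savings you get from lossy H.265, so it helps but won't make 233 TB small.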
Just some advice to anyone who finds themselves in this specific situation, since I found myself in almost the exact same situation:
If you really, really want to keep the data, and you can afford to spend the money (big if), move it to AWS. I had to move almost 4.5PB of data around Christmas of last year out of Google Drive. I spun up 60 EC2 instances, set up rclone on each one, and created a Google account for each instance. Google caps downloads per account at 10TB per day, but the EC2 instances I used were rate limited to 60 MB/s, so I didn't bump against the cap. I gave each EC2 instance a segment of the data, separating on file size. After transferring to AWS, verifying the data synced properly, and building a database to find files, I dropped it all to Glacier Deep Archive. I averaged just over 3.62GB/s for 14 days straight to move everything. Using a similar method, this poor guy's data could be moved in under a day, but it would cost a couple thousand dollars at least.
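To make the per-instance setup concrete, here's roughly what one worker's job looks like with rclone. All the remote names, bucket names, and paths here are made up for illustration; each instance would get its own Google account configured as its own rclone remote to stay under the 10TB/day cap:

```shell
# Copy this worker's segment from its dedicated Google Drive remote into S3.
# --transfers/--checkers control parallelism; tune to the instance's bandwidth.
rclone copy gdrive-worker17:archive/segment-17 s3:my-archive-bucket/segment-17 \
    --transfers 8 --checkers 16 --progress

# Verify the segment actually landed before touching anything at the source.
rclone check gdrive-worker17:archive/segment-17 s3:my-archive-bucket/segment-17

# Objects can then be shifted to Glacier Deep Archive with an S3 lifecycle rule,
# or uploaded straight into that class with:
#   rclone copy ... --s3-storage-class DEEP_ARCHIVE
```

The segment-per-worker split matters because it keeps the instances from stepping on each other and makes the post-transfer verification embarrassingly parallel too.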
Bad practice is bad practice, but you can get away with it for a while, just not forever. If you’re in this situation, because you made it, or because you’re cleaning up someone else’s mess, you’re going to have to spend money to fix it. If you’re not in this situation, be kind, but thank god you don’t have to deal with it.
Wow. That’s a lot of “homework”.
If the company was run by a hallucinating AI it couldn’t be any flakier.
I have a problem with Amazon Drive going away for non-photos on December 31st.
For a while, they had unlimited storage and you could use a Linux API to access it – I stored 8TB of data.
Then they set a quota, but for those over quota it was read-only. Oh, and Linux access no longer works.
Now they’ve set a deadline to have everything off by December 31st, but the Windows app still doesn’t work (constantly crashing) and I see no way to get my files.