I’m trying to decide how to import my Google Photos Takeout backup. I see two general ways:
- Import it by uploading it to Immich (immich-go, etc.)
- Add it as an External library
Has anyone done it one way or the other? Any recommendation, pros/cons or gotchas?
I recommend using this: https://github.com/TheLastGimbus/GooglePhotosTakeoutHelper
A couple of years ago, Google changed how Takeout works. Instead of exporting your photos with the EXIF data intact, exactly as you uploaded them (the original behavior, and how platforms such as OneDrive still do it), it strips the EXIF from the image and writes the original metadata to a separate .json sidecar in a non-standard format. This script is a free, open-source version of a paid tool: it goes through each image, finds the matching .json, and writes the EXIF data back into the file.
If you don’t do that, then when you reupload these photos to a new service, the date will be reset to the day you downloaded them, and location data will be missing entirely.
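For anyone curious what that actually involves, here’s a rough Python sketch of the idea (not the actual tool): read the sidecar’s timestamp and write it back as EXIF. The sidecar naming and the `photoTakenTime.timestamp` field are assumptions based on typical Takeout exports, and it shells out to exiftool, so that needs to be installed. The real helper handles far more (location data, albums, truncated filenames, sidecars split across archives), so treat this only as an illustration.

```python
#!/usr/bin/env python3
"""Sketch: restore 'date taken' from Google Takeout .json sidecars.

Assumptions (verify against your own export):
- the sidecar sits next to the image as <name>.<ext>.json
- it contains a photoTakenTime.timestamp field (Unix seconds)
- the exiftool binary is on PATH
"""

import json
import subprocess
from datetime import datetime, timezone
from pathlib import Path

TAKEOUT_DIR = Path("Takeout/Google Photos")  # adjust to your extracted export

for image in TAKEOUT_DIR.rglob("*.jpg"):
    sidecar = image.with_name(image.name + ".json")
    if not sidecar.exists():
        print(f"no sidecar for {image}")
        continue

    meta = json.loads(sidecar.read_text(encoding="utf-8"))
    ts = int(meta["photoTakenTime"]["timestamp"])
    taken = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y:%m:%d %H:%M:%S")

    # Write DateTimeOriginal back into the file so whatever service you
    # upload to reads the real capture date instead of the download date.
    subprocess.run(
        ["exiftool", "-overwrite_original", f"-DateTimeOriginal={taken}", str(image)],
        check=True,
    )
```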
Yes! I imported 23k media files into a new platform, and the Takeout process was such a pain. My destination was built to handle the zipped or unzipped media, but issues occasionally cropped up, like when files spanned archives but the .json was in the previous one. That resulted in orphaned files stamped with the upload date instead of the date taken.
Ultimately, I think I had the best experience extracting all 123GB and uploading the albums/folders that way.
Would have been SO much easier with an API that allowed cloud to cloud.
This was the tool I used. It worked great for me.
Google reminds me more and more of Microsoft in the 90s. That’s exactly the kind of compatibility-breaking, asinine move MS would have made 30 years ago. Sigh…
I wonder if this is worth doing even if I import with immich-go, which seems to combine this data too.