Using Retrobatch with iCloud Photos?

After moving away from Google Photos, I definitely miss the sophistication of their classification software.

Photos.app is pretty useless when it comes to identifying photos of my dog. Retrobatch’s classification works great. My ideal workflow would be: 1) select photos in Photos.app, 2) run a Quick Action to trigger a Retrobatch workflow, and 3) have the workflow write a keyword back to the image in my iCloud Photos library.

Is something like this achievable? I am fine with a solution that includes some scripting as well.
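
For step 3, something like this AppleScript is what I'm picturing. It's a minimal, untested sketch that assumes Photos' `keywords` property is writable on the current selection and that the change syncs back through iCloud (the keyword name "dog" is just my example):

```applescript
-- Untested sketch: add a "dog" keyword to whatever is currently selected in Photos.
tell application "Photos"
	repeat with theItem in (get selection)
		set currentKeywords to keywords of theItem
		-- keywords can come back as missing value when an item has none
		if currentKeywords is missing value then set currentKeywords to {}
		if currentKeywords does not contain "dog" then
			set keywords of theItem to currentKeywords & {"dog"}
		end if
	end repeat
end tell
```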

The trick is going to be getting Retrobatch the paths to the original files. I’m not sure of any way to do this in Photos, since sharing extensions and “Edit In…” stuff all make a copy of the master file.
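
The closest thing I can think of is Photos' scriptable export command, which can grab the originals, but that's still writing copies out somewhere rather than handing you the real path inside the library package. Something like this (untested):

```applescript
-- Untested: export the originals of the current selection to a scratch folder.
-- These are still copies, not the master files inside the library package.
do shell script "mkdir -p /tmp/photos-export"
set exportFolder to POSIX file "/tmp/photos-export" as alias
tell application "Photos"
	export (get selection) to exportFolder using originals true
end tell
```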

I’m not 100% up to date on Photos.app’s capabilities, though. Do you think you can get the path to the master files from a Photos selection?

@ccgus After a little more digging, I’m afraid not. I’m feeling like this might be an all-AppleScript endeavor, since any time images get passed to Retrobatch a copy gets created.

I am thinking this is going to require AppleScript to interact with Photos media objects, plus some type of TensorFlow CLI solution. Does that sound crazy?
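
Roughly, I'm picturing glue like this. Everything about the classifier CLI here is made up (the `classify_dog.py` name, its location, and its output format are placeholders for whatever TensorFlow tooling I end up with), and the exported filename may not match the library's filename exactly:

```applescript
-- Rough, untested sketch: export a copy of each selected item, run a placeholder
-- classifier CLI on it, and write a keyword back to the Photos item on a match.
do shell script "mkdir -p /tmp/photo-classify"
set exportFolder to POSIX file "/tmp/photo-classify" as alias

tell application "Photos"
	repeat with theItem in (get selection)
		export {contents of theItem} to exportFolder using originals true
		-- Assumes the exported file keeps the item's filename (not guaranteed).
		set exportedPath to "/tmp/photo-classify/" & (filename of theItem)
		-- classify_dog.py is hypothetical; assume it prints "dog" when it sees one.
		set classifierResult to do shell script "python3 ~/bin/classify_dog.py " & quoted form of exportedPath
		if classifierResult contains "dog" then
			set currentKeywords to keywords of theItem
			if currentKeywords is missing value then set currentKeywords to {}
			set keywords of theItem to currentKeywords & {"dog"}
		end if
	end repeat
end tell
```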

Retrobatch creates copies, but you can always copy them back to the original location as well.

But I’m curious what you can come up with using TensorFlow as well!