Delete originals

I’m creating a publishing workflow which will begin with large images exported from another application for this express purpose. As such, those large images aren’t needed once Retrobatch does its magic and generates other files.

Is there a best practice method for deleting the original files? How would it even be done? All I can imagine is branching a node directly off the initial image fetch (Read Folder in my case) that runs a script to delete each file.

There are no best practices for this yet.

A custom script could do it, but I’d test the heck out of it first before running it against your original images.
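One way to test it safely (just a sketch, not an official recipe) is to start with a script that only logs what it would delete, then swap in the rm once the log looks right:
#!/bin/bash
# Dry run: append the path Retrobatch passes in to a log instead of deleting it.
# The log location is just an example.
echo "$1" >> "$HOME/Desktop/retrobatch-delete.log"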

I tried this out tonight. My main pipeline goes Read Images > Set Metadata > Text Watermark, and then splits into three copies of Scale > Write Images. I added a new Shell Script node directly off Read Images.

The script is set to run at Every process and contains this:
#!/bin/bash
# Delete the original image; Retrobatch passes its path as the first argument.
rm "$1"

On the first attempt, Retrobatch crashed, but this was due to the script not having suitable permissions to execute. Once I fixed that, it behaved exactly as I wanted. I watched Finder as it worked: as each trio of output images appeared, the original disappeared.
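A more cautious variant (untested sketch; the folder name is arbitrary) would move the originals into a holding folder instead of removing them outright, so a mistake is recoverable:
#!/bin/bash
# Park the original in a holding folder rather than deleting it.
mkdir -p "$HOME/Pictures/processed-originals"
mv "$1" "$HOME/Pictures/processed-originals/"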

Ugh, a crasher!

That was an easy one to fix though, and I've done so in the latest build:
http://flyingmeat.com/download/latest/#retrobatch

Yikes! I was doing some tweaking to my workflow, and once it was complete I ran it, only to find that the delete preceded all the other processing. The net effect: it simply deleted all the originals and created no output.

So it seems the sequencing of two child nodes from the same parent is not deterministic.

Correct. Retrobatch makes no guarantees (and offers no UI, for that matter) about which child node will be called first.
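Given that, one way to sidestep the ordering problem entirely (just a sketch; the folder layout and filename suffixes are assumptions about this particular workflow, not anything Retrobatch provides) is to run the cleanup as a separate pass after the workflow finishes, deleting only those originals whose outputs all exist:
#!/bin/bash
# Post-run cleanup: delete an original only if all three expected outputs exist.
# The folder paths and the -large/-medium/-small suffixes are assumptions.
originals="$HOME/Pictures/originals"
out="$HOME/Pictures/output"
for f in "$originals"/*; do
  name="$(basename "${f%.*}")"
  if [ -e "$out/${name}-large.jpg" ] && [ -e "$out/${name}-medium.jpg" ] && [ -e "$out/${name}-small.jpg" ]; then
    rm "$f"
  fi
done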