There is one piece I didn't mention: to support a user copying in a new version of a file, I have to recalculate the hash whenever the last-modified timestamp reported by Google Drive doesn't match the one I recorded in the hashcodes file. If I only compared cached hashcodes, I'd have no way to detect that a user updated a file. Because of that, I always have to query every file to check its last-modified timestamp, and that's probably what is taking 10 minutes.

Let me know if you have any thoughts on this. One option would be to add a checkbox in the settings to "ignore last modified timestamps", so that I skip the check for modified files and rely only on cached hashcodes. That would speed up the processing significantly; it would probably only take a couple of seconds, as you expect.
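For reference, the cache-invalidation logic I described is roughly the following. This is a minimal sketch using local files and SHA-256; the function name and cache layout are made up for illustration, and the real code would query the Google Drive API for the remote timestamp rather than the local filesystem:

```python
import hashlib
import os


def get_file_hash(path: str, cache: dict) -> str:
    """Return the file's hash, recomputing it only when the recorded
    last-modified timestamp no longer matches the current one."""
    mtime = os.path.getmtime(path)
    entry = cache.get(path)
    if entry is not None and entry["mtime"] == mtime:
        # Timestamps match, so the cached hash is trusted.
        return entry["hash"]
    # Timestamp changed (or file is new): recompute and update the cache.
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    cache[path] = {"mtime": mtime, "hash": digest}
    return digest
```

The slow part in my case isn't the hashing itself but fetching every file's timestamp from the cloud, which is why skipping the timestamp check entirely (and trusting the cache) would make such a big difference.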
Mike
EDIT:
A better label for the setting would probably be "Check for updated files in cloud folders".
Mike