# Compare commits

10 Commits:

| SHA1 |
|---|
| 65e352e8a4 |
| 38083d8a46 |
| ce676dea72 |
| 8548cc1f0f |
| 63202b53f1 |
| 2836a40616 |
| 19a8c2b997 |
| 217cd87feb |
| a0a023859a |
| 87f8ad4dae |
## .gitignore (vendored, new file, 1 addition)

```diff
@@ -0,0 +1 @@
+.venv/
```
## README.md (37 changed lines)

````diff
@@ -26,8 +26,9 @@ reverse_geolocate.py [-h] -i
 [-x [EXCLUDE XMP SOURCE FOLDER [EXCLUDE XMP SOURCE FOLDER ...]]]
 [-l LIGHTROOM FOLDER] [-s]
 [-f <overwrite, location, city, state, country, countrycode>]
-[-g GOOGLE API KEY] [-o] [-e EMIL ADDRESS] [-w]
-[-r] [-u] [-a] [-c] [-n] [-v] [--debug] [--test]
+[-d [FUZZY DISTANCE]] [-g GOOGLE API KEY] [-o]
+[-e EMIL ADDRESS] [-w] [-r] [-u] [-a] [-c] [-n]
+[-v] [--debug] [--test]
 
 ### Arguments
 
@@ -38,6 +39,7 @@ Argument | Argument Value | Description
 -l, --lightroom | Lightroom DB base folder | The folder where the .lrcat file is located. Optional, if this is set, LR values are read before any Google maps connection is done. Fills the Latitude and Longitude and the location names. Lightroom data never overwrites data already set in the XMP sidecar file. It is recommended to have Lightroom write the XMP sidecar file before this script is run
 -s, --strict | | Do strict check for Lightroom files and include the path into the check
 -f, --field | Keyword: overwrite, location, city, state, country, countrycode | In the default no data is overwritten if it is already set. With the 'overwrite' flag all data is set new from the Google Maps location data. Other arguments are each of the location fields and if set only this field will be set. This can be combined with the 'overwrite' flag to overwrite already set data
+-d, --fuzzy-cache | distance | Allow fuzzy cache lookup with either default value of 10m or an override value in m or km
 -n, --nobackup | | Do not create a backup of XMP sidecar file when it is changed
 -o, --openstreetmap | | Use OpenStreetMap instead of the default google maps
 -e, --email | email address | For OpenStreetMap with a large number of access
@@ -87,6 +89,12 @@ openstreetmapemail = <email>
 
 if no -g or -e flag is given the keys are read from the config file. If the -g or -e parameter is given it will override the one found in the config file. A new parameter can be written to this config file with -w parameter.
 
+### Cache lookups ###
+
+If the same GPS coordinate is detected no other API maps call is done. With the fuzzy-distance argument this can be further extended to certain distances for each GPS coordinate from each other. The default value is 10m and can be overriden with an value to the argument.
+
+Can be used to force cache on GPS coordinates that are very close to each other but not exactly the same.
+
 ### Google data priority
 
 Based in the JSON return data the following fields are set in order. If one can not be found for a target set, the next one below is used
@@ -123,16 +131,17 @@ order | type | target set
 After the script is done the following overview will be printed
 
 ```
-=======================================
+========================================
 XMP Files found : 57
 Updated : 3
 Skipped : 54
 New GeoLocation from Map : 2
 GeoLocation from Cache : 1
+GeoLocation from Fuzzy Cache : 0
 Failed reverse GeoLocate : 0
 GeoLocaction from Lightroom : 1
 No Lightroom data found : 46
 More than one found in LR : 0
 ```
 
 If there are problems with getting data from the Google Maps API the complete errior sting will be printed
@@ -148,11 +157,11 @@ Also the files that could not be updated will be printed at the end of the run u
 
 ```
 ...
-------------------------------
+----------------------------------------
 Files that failed to update:
 Photos/2017/02/some_file.xmp
 ```
 
 ### Tested OS
 
 This script has only been tested on macOS
````
## reverse_geolocate.py

```diff
@@ -22,7 +22,7 @@ import re
 # Note XMPFiles does not work with sidecar files, need to read via XMPMeta
 from libxmp import XMPMeta, consts
 from shutil import copyfile, get_terminal_size
-from math import ceil
+from math import ceil, radians, sin, cos, atan2, sqrt
 
 ##############################################################
 # FUNCTIONS
@@ -70,6 +70,20 @@ class readable_dir(argparse.Action):
         raise argparse.ArgumentTypeError("readable_dir:{0} is not a readable dir".format(prospective_dir))
 
 
+# check distance values are valid
+class distance_values(argparse.Action):
+    def __call__(self, parser, namespace, values, option_string=None):
+        m = re.match(r'^(\d+)\s?(m|km)$', values)
+        if m:
+            # convert to int in meters
+            values = int(m.group(1))
+            if m.group(2) == 'km':
+                values *= 1000
+            setattr(namespace, self.dest, values)
+        else:
+            raise argparse.ArgumentTypeError("distance_values:{0} is not a valid argument".format(values))
+
+
 # MAIN FUNCTIONS
 
 # METHOD: reverseGeolocate
```
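As a quick sanity check of the new `distance_values` action, here is a self-contained sketch (the class body is reconstructed from the hunk above; the parser wiring mirrors the `-d` option this diff adds, and indentation is an assumption since the rendered view strips it):

```python
import argparse
import re

# Mirror of the distance_values action added in this diff: accepts "<n>m" or
# "<n> km" style strings and stores the distance as an integer in meters.
class distance_values(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        m = re.match(r'^(\d+)\s?(m|km)$', values)
        if m:
            values = int(m.group(1))
            if m.group(2) == 'km':
                values *= 1000  # normalize kilometers to meters
            setattr(namespace, self.dest, values)
        else:
            raise argparse.ArgumentTypeError("distance_values:{0} is not a valid argument".format(values))

parser = argparse.ArgumentParser()
parser.add_argument('-d', '--fuzzy-cache', action=distance_values, nargs='?',
                    const='10m', dest='fuzzy_distance')

print(parser.parse_args(['-d', '2km']).fuzzy_distance)  # 2000
```

Because the option uses `nargs='?'` with `const='10m'`, a bare `-d` also goes through the action and yields 10 meters.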
```diff
@@ -114,7 +128,7 @@ def reverseGeolocateInit(longitude, latitude):
         'error_message': ''
     }
     # error if long/lat is not valid
-    latlong_re = re.compile('^\d+\.\d+$')
+    latlong_re = re.compile(r'^\d+\.\d+$')
     if not latlong_re.match(str(longitude)) or not latlong_re.match(str(latitude)):
         geolocation['status'] = 'ERROR'
         geolocation['error_message'] = 'Latitude {} or Longitude {} are not valid'.format(latitude, longitude)
@@ -187,7 +201,7 @@ def reverseGeolocateOpenStreetMap(longitude, latitude):
 # dict with location, city, state, country, country code
 # if not fillable, entry is empty
 # SAMPLE: http://maps.googleapis.com/maps/api/geocode/json?latlng=<latitude>,<longitude>&language=<lang>&sensor=false&key=<api key>
-def reverseGeolocateGoogle(longitude, latitude):
+def reverseGeolocateGoogle(longitude, latitude):  # noqa: C901
     # init
     geolocation = reverseGeolocateInit(longitude, latitude)
     temp_geolocation = geolocation.copy()
@@ -224,7 +238,7 @@ def reverseGeolocateGoogle(longitude, latitude):
         'CountryCode': ['country'],
         'Country': ['country'],
         'State': ['administrative_area_level_1', 'administrative_area_level_2'],
-        'City': ['locality'],
+        'City': ['locality', 'administrative_area_level_3'],
         'Location': ['sublocality_level_1', 'sublocality_level_2', 'route'],
     }
     # print("Error: {}".format(response.json()['status']))
```
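The `City` change above adds `administrative_area_level_3` as a second candidate type, so the first matching Google `address_components` type in the list wins. A minimal sketch of that priority resolution (the sample components are made up; `resolve` is a hypothetical helper, not a function from the script):

```python
# Priority mapping as in the diff: the first matching component type fills the field.
field_types = {
    'CountryCode': ['country'],
    'Country': ['country'],
    'State': ['administrative_area_level_1', 'administrative_area_level_2'],
    'City': ['locality', 'administrative_area_level_3'],
    'Location': ['sublocality_level_1', 'sublocality_level_2', 'route'],
}

# Made-up excerpt shaped like Google's reverse geocode address_components.
components = [
    {'long_name': 'Shibuya', 'types': ['sublocality_level_1', 'political']},
    {'long_name': 'Tokyo', 'types': ['administrative_area_level_1', 'political']},
    {'long_name': 'Japan', 'types': ['country', 'political']},
]

def resolve(field):
    # walk the candidate types in priority order, return the first match
    for wanted in field_types[field]:
        for comp in components:
            if wanted in comp['types']:
                return comp['long_name']
    return ''

print(resolve('State'))  # Tokyo
print(resolve('City'))   # empty: no locality and no administrative_area_level_3
```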
```diff
@@ -275,7 +289,6 @@ def reverseGeolocateGoogle(longitude, latitude):
         geolocation['error_message'] = response.json()['error_message']
         geolocation['status'] = response.json()['status']
         print("Error in request: {} {}".format(geolocation['status'], geolocation['error_message']))
-
     # return
     return geolocation
 
@@ -315,7 +328,7 @@ def convertLongToDMS(lat_long):
 # number used in google/lr internal
 def longLatReg(longitude, latitude):
     # regex
-    latlong_re = re.compile('^(\d+),(\d+\.\d+)([NESW]{1})$')
+    latlong_re = re.compile(r'^(\d+),(\d+\.\d+)([NESW]{1})$')
     # dict for loop
     lat_long = {
         'longitude': longitude,
```
```diff
@@ -343,6 +356,27 @@ def convertDMStoLong(lat_long):
     return longLatReg(lat_long, '0,0.0N')['longitude']
 
 
+# METHOD: getDistance
+# PARAMS: from long/lat, to long_lat
+# RETURN: distance in meters
+# DESC : calculates the difference between two coordinates
+def getDistance(from_longitude, from_latitude, to_longitude, to_latitude):
+    # earth radius in meters
+    earth_radius = 6378137.0
+    # convert all from radians with pre convert DMS to long and to float
+    from_longitude = radians(float(convertDMStoLong(from_longitude)))
+    from_latitude = radians(float(convertDMStoLat(from_latitude)))
+    to_longitude = radians(float(convertDMStoLong(to_longitude)))
+    to_latitude = radians(float(convertDMStoLat(to_latitude)))
+    # distance from - to
+    distance_longitude = from_longitude - to_longitude
+    distance_latitude = from_latitude - to_latitude
+    # main distance calculation
+    distance = sin(distance_latitude / 2)**2 + cos(from_latitude) * cos(to_latitude) * sin(distance_longitude / 2)**2
+    distance = 2 * atan2(sqrt(distance), sqrt(1 - distance))
+    return earth_radius * distance
+
+
 # METHOD: checkOverwrite
 # PARAMS: data: value field, key: XMP key, field_controls: array from args
 # RETURN: true/false
```
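The new `getDistance` is the haversine formula over an Earth radius of 6378137 m. Stripped of the script's DMS pre-conversion (plain float degrees in, meters out), the math can be checked in isolation like this:

```python
from math import radians, sin, cos, atan2, sqrt

def get_distance(from_longitude, from_latitude, to_longitude, to_latitude):
    # haversine distance: same formula and radius as getDistance in the diff,
    # but taking decimal degrees directly instead of DMS strings
    earth_radius = 6378137.0
    from_longitude, from_latitude, to_longitude, to_latitude = (
        radians(v) for v in (from_longitude, from_latitude, to_longitude, to_latitude))
    distance_longitude = from_longitude - to_longitude
    distance_latitude = from_latitude - to_latitude
    a = sin(distance_latitude / 2)**2 + cos(from_latitude) * cos(to_latitude) * sin(distance_longitude / 2)**2
    return earth_radius * 2 * atan2(sqrt(a), sqrt(1 - a))

# a tenth of a milli-degree of latitude is roughly 11 m
print(round(get_distance(139.7, 35.6, 139.7, 35.6001), 1))
```

This confirms the 10 m default fuzzy distance sits around a 0.0001-degree latitude step.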
```diff
@@ -486,7 +520,7 @@ def formatLen(string, length):
 # RETURN: number found in the BK string or 0 for none
 # DESC : gets the BK number for sorting in the file list
 def fileSortNumber(file):
-    m = re.match('.*\.BK\.(\d+)\.xmp$', file)
+    m = re.match(r'.*\.BK\.(\d+)\.xmp$', file)
     return int(m.group(1)) if m is not None else 0
 
 
@@ -610,125 +644,178 @@ parser = argparse.ArgumentParser(
 
 # xmp folder (or folders), or file (or files)
 # note that the target directory or file needs to be writeable
-parser.add_argument('-i', '--include-source',
+parser.add_argument(
+    '-i',
+    '--include-source',
     required=True,
     nargs='*',
     action=writable_dir_folder,
     dest='xmp_sources',
     metavar='XMP SOURCE FOLDER',
     help='The source folder or folders with the XMP files that need reverse geo encoding to be set. Single XMP files can be given here'
 )
 # exclude folders
-parser.add_argument('-x', '--exclude-source',
+parser.add_argument(
+    '-x',
+    '--exclude-source',
     nargs='*',
     action=writable_dir_folder,
     dest='exclude_sources',
     metavar='EXCLUDE XMP SOURCE FOLDER',
     help='Folders and files that will be excluded.'
 )
 
 # LR database (base folder)
 # get .lrcat file in this folder
-parser.add_argument('-l', '--lightroom',
+parser.add_argument(
+    '-l',
+    '--lightroom',
     # required=True,
     action=readable_dir,
     dest='lightroom_folder',
     metavar='LIGHTROOM FOLDER',
     help='Lightroom catalogue base folder'
 )
 
 # strict LR check with base path next to the file base name
-parser.add_argument('-s', '--strict',
+parser.add_argument(
+    '-s',
+    '--strict',
     dest='lightroom_strict',
     action='store_true',
     help='Do strict check for Lightroom files including Path in query'
 )
 
 # set behaviour override
 # FLAG: default: only set not filled
 # other: overwrite all or overwrite if one is missing, overwrite specifc field (as defined below)
 # fields: Location, City, State, Country, CountryCode
-parser.add_argument('-f', '--field',
+parser.add_argument(
+    '-f',
+    '--field',
     action='append',
     type=str.lower,  # make it lowercase for check
     choices=['overwrite', 'location', 'city', 'state', 'country', 'countrycode'],
     dest='field_controls',
     metavar='<overwrite, location, city, state, country, countrycode>',
     help='On default only set fields that are not set yet. Options are: '\
         'Overwrite (write all new), Location, City, State, Country, CountryCode. '\
         'Multiple can be given for combination overwrite certain fields only or set only certain fields. '\
         'If with overwrite the field will be overwritten if already set, else it will be always skipped.'
 )
 
+parser.add_argument(
+    '-d',
+    '--fuzzy-cache',
+    type=str.lower,
+    action=distance_values,
+    nargs='?',
+    const='10m',  # default is 10m
+    dest='fuzzy_distance',
+    metavar='FUZZY DISTANCE',
+    help='Allow fuzzy distance cache lookup. Optional distance can be given, '\
+        'if not set default of 10m is used. '\
+        'Allowed argument is in the format of 12m or 12km'
+)
+
 # Google Maps API key to overcome restrictions
-parser.add_argument('-g', '--google',
+parser.add_argument(
+    '-g',
+    '--google',
     dest='google_api_key',
     metavar='GOOGLE API KEY',
     help='Set a Google API Maps key to overcome the default lookup limitations'
 )
 
 # use open street maps
-parser.add_argument('-o', '--openstreetmap',
+parser.add_argument(
+    '-o',
+    '--openstreetmap',
     dest='use_openstreetmap',
     action='store_true',
     help='Use openstreetmap instead of Google'
 )
 
 # email of open street maps requests
-parser.add_argument('-e', '--email',
+parser.add_argument(
+    '-e',
+    '--email',
     dest='email',
     metavar='EMIL ADDRESS',
     help='An email address for OpenStreetMap'
 )
 
 # write api/email settings to config file
-parser.add_argument('-w', '--write-settings',
+parser.add_argument(
+    '-w',
+    '--write-settings',
     dest='config_write',
     action='store_true',
     help='Write Google API or OpenStreetMap email to config file'
 )
 
 # only read data and print on screen, do not write anything
-parser.add_argument('-r', '--read-only',
+parser.add_argument(
+    '-r',
+    '--read-only',
     dest='read_only',
     action='store_true',
     help='Read current values from the XMP file only, do not read from LR or lookup any data and write back'
 )
 
 # only list unset ones
-parser.add_argument('-u', '--unset-only',
+parser.add_argument(
+    '-u',
+    '--unset-only',
     dest='unset_only',
     action='store_true',
     help='Only list unset XMP files'
 )
+
+# only list unset GPS codes
+parser.add_argument(
+    '-p',
+    '--unset-gps-only',
+    dest='unset_gps_only',
+    action='store_true',
+    help='Only list unset XMP files for GPS fields'
+)
 
 # don't try to do auto adjust in list view
-parser.add_argument('-a', '--no-autoadjust',
+parser.add_argument(
+    '-a',
+    '--no-autoadjust',
     dest='no_autoadjust',
     action='store_true',
     help='Don\'t try to auto adjust columns'
 )
 
 # compact view, compresses columns down to a minimum
-parser.add_argument('-c', '--compact',
+parser.add_argument(
+    '-c',
+    '--compact',
     dest='compact_view',
     action='store_true',
     help='Very compact list view'
 )
 
 # Do not create backup files
-parser.add_argument('-n', '--nobackup',
+parser.add_argument(
+    '-n',
+    '--nobackup',
     dest='no_xmp_backup',
     action='store_true',
     help='Do not create a backup from the XMP file'
 )
 
 # verbose args for more detailed output
-parser.add_argument('-v', '--verbose',
+parser.add_argument(
+    '-v',
+    '--verbose',
     action='count',
     dest='verbose',
     help='Set verbose output level'
 )
 
 # debug flag
 parser.add_argument('--debug', action='store_true', dest='debug', help='Set detailed debug output')
```
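The `-f/--field` option in the hunk above combines `action='append'` with `type=str.lower` and a `choices` list, so repeated flags accumulate into a lowercase list (which is how `overwrite` can be combined with individual field names). A small standalone sketch of just that behavior:

```python
import argparse

parser = argparse.ArgumentParser()
# Same shape as the -f/--field option in the diff (help text omitted).
parser.add_argument(
    '-f',
    '--field',
    action='append',
    type=str.lower,  # normalize case before the choices check
    choices=['overwrite', 'location', 'city', 'state', 'country', 'countrycode'],
    dest='field_controls',
)

# repeated flags append; type runs before choices, so mixed case is accepted
args = parser.parse_args(['-f', 'Overwrite', '-f', 'CITY'])
print(args.field_controls)  # ['overwrite', 'city']
```

Note that argparse applies `type` before validating against `choices`, which is why the lowercase conversion makes the check case-insensitive.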
```diff
@@ -753,11 +840,12 @@ if not args.unset_only:
     args.unset_only = 0
 
 if args.debug:
-    print("### ARGUMENT VARS: I: {incl}, X: {excl}, L: {lr}, F: {fc}, M: {osm}, G: {gp}, E: {em}, R: {read}, U: {us}, A: {adj}, C: {cmp}, N: {nbk}, W: {wrc}, V: {v}, D: {d}, T: {t}".format(
+    print("### ARGUMENT VARS: I: {incl}, X: {excl}, L: {lr}, F: {fc}, D: {fdist}, M: {osm}, G: {gp}, E: {em}, R: {read}, U: {us}, A: {adj}, C: {cmp}, N: {nbk}, W: {wrc}, V: {v}, D: {d}, T: {t}".format(
         incl=args.xmp_sources,
         excl=args.exclude_sources,
         lr=args.lightroom_folder,
         fc=args.field_controls,
+        fdist=args.fuzzy_distance,
         osm=args.use_openstreetmap,
         gp=args.google_api_key,
         em=args.email,
@@ -790,7 +878,7 @@ if args.email and not args.use_openstreetmap:
     error = True
 # if email and not basic valid email (@ .)
 if args.email:
-    if not re.match('^.+@.+\.[A-Za-z]{1,}$', args.email):
+    if not re.match(r'^.+@.+\.[A-Za-z]{1,}$', args.email):
         print("Not a valid email for OpenStreetMap: {}".format(args.email))
         error = True
 # on error exit here
@@ -887,6 +975,7 @@ count = {
     'read': 0,
     'map': 0,
     'cache': 0,
+    'fuzzy_cache': 0,
     'lightroom': 0,
     'changed': 0,
     'failed': 0,
@@ -932,6 +1021,8 @@ if args.lightroom_folder:
     cur = lrdb.cursor()
     # flag that we have Lightroom DB
     use_lightroom = True
+    if args.debug:
+        print("### USE Lightroom {}".format(use_lightroom))
 
 # on error exit here
 if error:
@@ -955,8 +1046,8 @@ for xmp_file_source in args.xmp_sources:
             # 2) file is not in exclude list
             # 3) full folder is not in exclude list
             if file.endswith(".xmp") and ".BK." not in file \
                     and "{}/{}".format(root, file) not in args.exclude_sources \
                     and root.rstrip('/') not in [x.rstrip('/') for x in args.exclude_sources]:
                 if "{}/{}".format(root, file) not in work_files:
                     work_files.append("{}/{}".format(root, file))
                     count['all'] += 1
@@ -1040,7 +1131,7 @@ if args.read_only:
 
 # ### MAIN WORK LOOP
 # now we just loop through each file and work on them
-for xmp_file in work_files:
+for xmp_file in work_files:  # noqa: C901
     if not args.read_only:
         print("---> {}: ".format(xmp_file), end='')
 
```
```diff
@@ -1054,12 +1145,16 @@ for xmp_file in work_files:
     # read fields from the XMP file and store in hash
     xmp.parse_from_str(strbuffer)
     for xmp_field in xmp_fields:
-        data_set[xmp_field] = xmp.get_property(xmp_fields[xmp_field], xmp_field)
+        # need to check if propert exist or it will the exempi routine will fail
+        if xmp.does_property_exist(xmp_fields[xmp_field], xmp_field):
+            data_set[xmp_field] = xmp.get_property(xmp_fields[xmp_field], xmp_field)
+        else:
+            data_set[xmp_field] = ''
         if args.debug:
             print("### => XMP: {}:{} => {}".format(xmp_fields[xmp_field], xmp_field, data_set[xmp_field]))
     if args.read_only:
         # view only if list all or if data is unset
-        if not args.unset_only or (args.unset_only and '' in data_set.values()):
+        if (not args.unset_only and not args.unset_gps_only) or (args.unset_only and '' in data_set.values()) or (args.unset_gps_only and (not data_set['GPSLatitude'] or not data_set['GPSLongitude'])):
             # for read only we print out the data formatted
             # headline check, do we need to print that
             count['read'] = printHeader(header_line.format(page_no=page_no, page_all=page_all), count['read'], header_repeat)
```
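The guard above exists because exempi errors out when `get_property` is asked for a property that is not in the sidecar file. The pattern, shown with a hypothetical stub standing in for libxmp's `XMPMeta` (the stub and its sample data are invented for illustration):

```python
# Hypothetical stand-in for libxmp's XMPMeta to illustrate the guard in the diff:
# reading a missing property raises, so existence is checked first.
class FakeXMP:
    def __init__(self, props):
        self._props = props

    def does_property_exist(self, schema, name):
        return (schema, name) in self._props

    def get_property(self, schema, name):
        if (schema, name) not in self._props:
            raise RuntimeError('property not found')  # exempi would fail here
        return self._props[(schema, name)]

xmp = FakeXMP({('photoshop', 'City'): 'Tokyo'})
xmp_fields = {'City': 'photoshop', 'Country': 'photoshop'}

data_set = {}
for xmp_field in xmp_fields:
    # guard as in the diff: only read existing properties, default to ''
    if xmp.does_property_exist(xmp_fields[xmp_field], xmp_field):
        data_set[xmp_field] = xmp.get_property(xmp_fields[xmp_field], xmp_field)
    else:
        data_set[xmp_field] = ''

print(data_set)  # {'City': 'Tokyo', 'Country': ''}
```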
@@ -1140,23 +1235,59 @@ for xmp_file in work_files:
|
|||||||
# run this through the overwrite checker to get unset if we have a forced overwrite
|
# run this through the overwrite checker to get unset if we have a forced overwrite
|
||||||
has_unset = False
|
has_unset = False
|
||||||
failed = False
|
failed = False
+from_cache = False
 for loc in data_set_loc:
     if checkOverwrite(data_set[loc], loc, args.field_controls):
         has_unset = True
 if has_unset:
     # check if lat/long is in cache
-    cache_key = '{}.#.{}'.format(data_set['GPSLatitude'], data_set['GPSLongitude'])
+    cache_key = '{}#{}'.format(data_set['GPSLongitude'], data_set['GPSLatitude'])
     if args.debug:
         print("### *** CACHE: {}: {}".format(cache_key, 'NO' if cache_key not in data_cache else 'YES'))
+    # main chache check = identical
+    # second cache level check is on distance:
+    # default distance is 10m, can be set via flag
+    # check distance to previous cache entries (reverse newest to oldest) and match before we do google lookup
     if cache_key not in data_cache:
-        # get location from maps (google or openstreetmap)
-        maps_location = reverseGeolocate(latitude=data_set['GPSLatitude'], longitude=data_set['GPSLongitude'], map_type=map_type)
-        # cache data with Lat/Long
-        data_cache[cache_key] = maps_location
+        has_fuzzy_cache = False
+        if args.fuzzy_distance:
+            shortest_distance = args.fuzzy_distance
+            best_match_latlong = ''
+            # check if we have fuzzy distance, if no valid found do maps lookup
+            for _cache_key in data_cache:
+                # split up cache key so we can use in the distance calc method
+                to_lat_long = _cache_key.split('#')
+                # get the distance based on current set + cached set
+                # print("Lookup f-long {} f-lat {} t-long {} t-lat {}".format(data_set['GPSLongitude'], data_set['GPSLatitude'], to_lat_long[0], to_lat_long[1]))
+                distance = getDistance(from_longitude=data_set['GPSLongitude'], from_latitude=data_set['GPSLatitude'], to_longitude=to_lat_long[0], to_latitude=to_lat_long[1])
+                if args.debug:
+                    print("### **= FUZZY CACHE: => distance: {} (m), shortest: {}".format(distance, shortest_distance))
+                if distance <= shortest_distance:
+                    # set new distance and keep current best matching location
+                    shortest_distance = distance
+                    best_match_latlong = _cache_key
+                    has_fuzzy_cache = True
+                    if args.debug:
+                        print("### ***= FUZZY CACHE: YES => Best match: {}".format(best_match_latlong))
+        if not has_fuzzy_cache:
+            # get location from maps (google or openstreetmap)
+            maps_location = reverseGeolocate(latitude=data_set['GPSLatitude'], longitude=data_set['GPSLongitude'], map_type=map_type)
+            # cache data with Lat/Long
+            data_cache[cache_key] = maps_location
+            from_cache = False
+        else:
+            maps_location = data_cache[best_match_latlong]
+            # cache this one, because the next one will match this one too
+            # we don't need to loop search again for the same fuzzy location
+            data_cache[cache_key] = maps_location
+            count['cache'] += 1
+            count['fuzzy_cache'] += 1
+            from_cache = True
     else:
         # load location from cache
         maps_location = data_cache[cache_key]
         count['cache'] += 1
+        from_cache = True
     # overwrite sets (note options check here)
     if args.debug:
         print("### Map Location ({}): {}".format(map_type, maps_location))
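The fuzzy-cache branch added above skips a second reverse-geocode request whenever a photo's coordinates fall within `fuzzy_distance` meters of an already-resolved point. The idea can be sketched as a standalone helper; the `get_distance`/`fuzzy_cache_lookup` names here are illustrative, the `longitude#latitude` cache-key layout mirrors the diff, and the script's own `getDistance` implementation may use a different distance formula:

```python
import math


def get_distance(from_longitude, from_latitude, to_longitude, to_latitude):
    """Great-circle distance in meters between two points (haversine formula)."""
    radius = 6371000  # mean Earth radius in meters
    phi1 = math.radians(float(from_latitude))
    phi2 = math.radians(float(to_latitude))
    d_phi = math.radians(float(to_latitude) - float(from_latitude))
    d_lambda = math.radians(float(to_longitude) - float(from_longitude))
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lambda / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(a))


def fuzzy_cache_lookup(cache, longitude, latitude, fuzzy_distance):
    """Return the cached location nearest to the point, if within fuzzy_distance meters."""
    best_key = None
    shortest = fuzzy_distance
    for key in cache:
        # keys are 'longitude#latitude', matching the diff's cache_key format
        to_long, to_lat = key.split('#')
        distance = get_distance(longitude, latitude, to_long, to_lat)
        if distance <= shortest:
            shortest = distance
            best_key = key
    return cache[best_key] if best_key is not None else None
```

A point a few meters from a cached entry reuses the cached result; a point a kilometer away falls through to a fresh map lookup, which is exactly the trade-off the `-d`/`--fuzzy-distance` flag controls.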
@@ -1199,8 +1330,11 @@ for xmp_file in work_files:
         with open(xmp_file, 'w') as fptr:
             fptr.write(xmp.serialize_to_str(omit_packet_wrapper=True))
     else:
-        print("[TEST] Would write {} ".format(data_set, xmp_file), end='')
-    print("[UPDATED]")
+        print("[TEST] Would write {} {}".format(data_set, xmp_file), end='')
+    if from_cache:
+        print("[UPDATED FROM CACHE]")
+    else:
+        print("[UPDATED]")
     count['changed'] += 1
 elif failed:
     print("[FAILED]")
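Besides routing the status message through the new `from_cache` flag, this hunk fixes a quiet format-string bug: the old template had only one `{}` placeholder, so the second argument to `str.format` (the file name) was silently discarded rather than raising an error. A minimal illustration, with hypothetical values:

```python
data_set = {'City': 'Tokyo'}       # stand-in for the script's location data
xmp_file = 'IMG_0001.xmp'          # hypothetical file name, for illustration only

# old template: one placeholder, so the extra argument is silently ignored
old = "[TEST] Would write {} ".format(data_set, xmp_file)

# fixed template: both the data set and the file name appear in the output
new = "[TEST] Would write {} {}".format(data_set, xmp_file)
```

Unlike `%`-style formatting, `str.format` does not complain about unused positional arguments, which is why the missing file name went unnoticed until this commit.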
@@ -1216,23 +1350,24 @@ if use_lightroom:
     lrdb.close()

 # end stats only if we write
-print("{}".format('=' * 39))
+print("{}".format('=' * 40))
 print("XMP Files found              : {:9,}".format(count['all']))
 if args.read_only:
     print("XMP Files listed             : {:9,}".format(count['listed']))
 if not args.read_only:
     print("Updated                      : {:9,}".format(count['changed']))
     print("Skipped                      : {:9,}".format(count['skipped']))
     print("New GeoLocation from Map     : {:9,}".format(count['map']))
     print("GeoLocation from Cache       : {:9,}".format(count['cache']))
+    print("GeoLocation from Fuzzy Cache : {:9,}".format(count['fuzzy_cache']))
     print("Failed reverse GeoLocate     : {:9,}".format(count['failed']))
     if use_lightroom:
         print("GeoLocaction from Lightroom  : {:9,}".format(count['lightroom']))
         print("No Lightroom data found      : {:9,}".format(count['not_found']))
         print("More than one found in LR    : {:9,}".format(count['many_found']))
 # if we have failed data
 if len(failed_files) > 0:
-    print("{}".format('-' * 39))
+    print("{}".format('-' * 40))
     print("Files that failed to update:")
     print("{}".format(', '.join(failed_files)))