I’m recording videos with a GoPro 9 on my bike, then extracting images with the mapillary_tools CLI and geotagging them with the GPS track from my phone (much more accurate than the GoPro’s internal GPS).
The GoPro starts a new .MP4 file every 4 GB, so a long ride is split into several segments (after my renaming):
20220415_01_01.MP4
20220415_01_02.MP4
20220415_01_03.MP4
Aim:
Extract images from all the video segments in one go while also detecting duplicates from the GPS data (e.g. standing at a red light on the bike; see this thread about duplicates).
First method tried:
mapillary_tools --verbose video_process "path_to_folder_containing_videos" \
--geotag_source "gpx" \
--geotag_source_path "path_to_gpx_file" \
--interpolation_use_gpx_start_time \
--video_sample_interval 0.5 \
--overwrite_all_EXIF_tags \
--duplicate_distance 3 \
--duplicate_angle 359
Result:
- the second video segment is mapped onto the same GPS coordinates as the first one (presumably because --interpolation_use_gpx_start_time anchors every segment to the start of the GPX track)
- duplicates are identified correctly, though, and are not chosen for upload when the images are put into the Mapillary desktop uploader
Second method tried:
- Concatenate all the segments into one long video using ffmpeg (see the sketch below), then process as above.
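For reference, the concatenation step looks roughly like this (ffmpeg’s concat demuxer with stream copy, so nothing is re-encoded; file names match the example above):

# list the chapter files in order for the concat demuxer
printf "file '%s'\n" 20220415_01_*.MP4 > list.txt
# stream-copy them into one file
ffmpeg -f concat -safe 0 -i list.txt -c copy 20220415_01_full.MP4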
Problem: I get MapillaryVideoError: Failed to find video creation_time in [video_path].
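I suspect ffmpeg simply drops the creation_time tag when concatenating. Presumably it could be written back in the same step (untested sketch; the timestamp is a placeholder):

# untested: write a creation_time tag into the merged file so
# mapillary_tools can find it
ffmpeg -f concat -safe 0 -i list.txt -c copy \
  -metadata creation_time="2022-04-15T08:00:00Z" 20220415_01_full.MP4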
As a workaround:
- Specify --video_start_time manually instead of --interpolation_use_gpx_start_time, but that takes a few tries because the GoPro’s internal clock (even though it was set to the same time as my phone) seems to drift, so the offset isn’t the same for videos taken on different days (see the exiftool sketch after this list).
- MapillaryOutsideGPXTrackError comes up, so I used exiftool to add the GPS info:
exiftool -geotag "path_to_gpx_file" path_to_video_file
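One way to cut down on the guessing might be to read the camera’s own recorded start time from the first segment and correct from there (exiftool sketch; the QuickTimeUTC option makes it report the UTC value):

# read the recorded UTC start time of the first chapter
exiftool -api QuickTimeUTC -CreateDate 20220415_01_01.MP4

That value, adjusted for the drift, could then be passed to --video_start_time.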
Result:
- GPS info in the image metadata is correct (if video_start_time is accurate)
- for duplicate detection, though, I’d have to process again, since the images only have GPS info now (see the sketch below)
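That second pass would presumably look like this, assuming the plain process command accepts the same duplicate flags as video_process (which seems to wrap it):

# re-run processing on the sampled frames so duplicate detection
# can act on the GPS data now embedded in the EXIF
mapillary_tools process "path_to_image_folder" \
  --duplicate_distance 3 \
  --duplicate_angle 359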
In a nutshell, I’m missing something in method 1, and method 2 is getting messy and cumbersome.
Any hints for making either work are very much appreciated!
I’m sitting on some 600 km of footage that I want to process and upload…
Thanks,
Paul