A follow-up to a post I made a few months ago on using the 'color DNA' of a film to disambiguate it, and to others I've written about manipulating color in Apps Script.
In Making a film color DNA I described how I was using both color strips and labelled content to find videos that were duplicates of, or similar to, each other. I'm already using Elasticsearch to disambiguate films via the labels and object-tracking artefacts identified by the Google Video Intelligence API. The next step is to enhance that similarity matching by using the color strips of a film as fingerprints.
This article also covers how to create vector embeddings that can be used for nearest-neighbor similarity searches.
article link - https://ramblings.mcpher.com/vector-embeddings-in-elastic-search/
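To give a flavor of the similarity matching before diving in: a color-strip fingerprint can be treated as a plain numeric vector, and two films compared by cosine similarity, the same measure commonly used for nearest-neighbor search over dense vectors in Elasticsearch. This is a minimal illustrative sketch, not the article's actual code; the fingerprint layout (two dominant colors flattened to RGB) and the sample values are my own assumptions.

```javascript
// Cosine similarity between two equal-length numeric vectors:
// dot(a, b) / (|a| * |b|), giving 1.0 for identical directions.
const cosineSimilarity = (a, b) => {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
};

// Hypothetical color-strip fingerprints: 2 dominant colors x RGB,
// each channel normalized to 0..1. Values are made up for illustration.
const filmA = [0.9, 0.1, 0.1, 0.2, 0.2, 0.8];    // reds and blues
const filmB = [0.85, 0.15, 0.1, 0.25, 0.2, 0.75]; // near-duplicate of filmA
const filmC = [0.1, 0.9, 0.2, 0.1, 0.8, 0.3];     // mostly greens

// The near-duplicate scores higher than the unrelated film.
console.log(cosineSimilarity(filmA, filmB) > cosineSimilarity(filmA, filmC)); // true
```

A real setup would store these vectors in an Elasticsearch `dense_vector` field and let the cluster rank neighbors, but the scoring idea is the same as this standalone function.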