> Hi Brian, I want to import a 10GB CSV file into Prometheus, and after that run different queries to find out how it performs with high-cardinality data.
In Prometheus, time series data consists of float samples, so there's no "cardinality" in the values as such. However, each time series is identified by its unique set of labels, and if those labels have high cardinality, Prometheus will perform very poorly (due to an explosion in the number of distinct time series).
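As a rough illustration of why this matters (using hypothetical label cardinalities, not numbers from your data): the worst-case number of distinct time series is the product of the number of values each label can take.

```python
# Hypothetical example: the worst-case number of distinct time series
# is the product of the cardinalities of the individual labels.
from math import prod

label_cardinalities = {
    "user_id": 100_000,  # a high-cardinality label, e.g. one value per user
    "endpoint": 50,
    "status": 5,
}

series = prod(label_cardinalities.values())
print(f"worst-case distinct series: {series:,}")  # 25,000,000
```

A single unbounded label like `user_id` dominates that product, which is why the usual advice is to keep such values out of labels entirely.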
> Now which option is more suitable? And faster?
More suitable for ingestion into Prometheus? Backfilling via the OpenMetrics format is the only approach.
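A minimal sketch of that workflow, assuming a hypothetical CSV layout of `timestamp,name,value` with Unix timestamps in seconds (your actual columns will differ): convert the rows to OpenMetrics text, which must be terminated with `# EOF`, then build TSDB blocks from it with promtool.

```python
# Sketch: convert a CSV of samples into OpenMetrics text for backfilling.
# Assumes a hypothetical header "timestamp,name,value"; adjust to your data.
import csv

with open("data.csv") as src, open("data.om", "w") as dst:
    for row in csv.DictReader(src):
        # OpenMetrics sample line: <name> <value> <timestamp-in-seconds>
        dst.write(f'{row["name"]} {float(row["value"])} {int(row["timestamp"])}\n')
    dst.write("# EOF\n")  # backfill input must end with "# EOF"
```

Then create the blocks and copy them into Prometheus's data directory:

```
promtool tsdb create-blocks-from openmetrics data.om ./out
```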
More suitable for your application? I don't know what that application is, or anything about the data you're importing, so I can't really say.
If you have high-cardinality and/or non-numeric data, then you might want to look at logging systems (e.g. Loki, VictoriaLogs), document databases (e.g. OpenSearch/Elasticsearch, MongoDB), columnar databases (e.g. ClickHouse, Druid) or various other "analytics/big data" platforms.