It took a while for this to show up on the group, so I ended up going to sleep before I could update this.
It seems that Rails actually caches Active Record queries? Or is it Postgres? Here is an example of what I am referring to.
If this is true, I might not even have to worry about this that much!
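To illustrate what I mean about the caching (just a sketch of my current understanding, not the actual output I saw; as far as I know the query cache is normally only switched on for the duration of a web request, which is why this wraps the calls in a cache block):

ActiveRecord::Base.connection.cache do
  TimedAsset.order("id desc").limit(30).pluck(:id)  # first call hits Postgres
  TimedAsset.order("id desc").limit(30).pluck(:id)  # identical SQL -- shows up as CACHE in the log, no second round trip
end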
Anyways, after some experimentation in the Rails console (hoping to avoid any possible Rails caching) and using what Frederick recommended, here are my results.
TimedAsset.order("id").pluck(:id).last(30) <-- my old method (62.5, 16.6, 18.5, 15.6, 43.6, 15.5, 17.5)
SELECT "timed_assets"."id" FROM "timed_assets" ORDER BY id
TimedAsset.order("id desc").limit(30).pluck(:id) <-- frederick's method (1.0, 1.6, 0.9, 0.9, 0.9, 12.4, 1.6)
SELECT "timed_assets"."id" FROM "timed_assets" ORDER BY id desc LIMIT 200
TimedAsset.order("id").pluck(:id).last(200) <-- my old method (16.8, 18.5, 15.5, 15.6, 47.7, 17.7, 17.8)
SELECT "timed_assets"."id" FROM "timed_assets" ORDER BY id
TimedAsset.order("id desc").limit(200).pluck(:id) <-- frederick's method (1.1, 1.7, 1.7, 1.0, 1.7, 1.9, 1.8)
SELECT "timed_assets"."id" FROM "timed_assets" ORDER BY id desc LIMIT 200
I should look at the Postgres/Rails documentation more to see how limit and pluck actually work! From what I can tell, my method has Postgres order all the rows and send back the entire id column, and then Rails throws out everything except the last 200. Frederick's method does the same ordering, but the extra rows are discarded within Postgres because of the LIMIT, so only 200 ids ever get sent back to Rails. I assume the speedup is because Postgres handles the disposal of the extra rows and far less data goes over the wire?
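One way to check that assumption is to compare the SQL each relation generates and ask Postgres for its plan. A quick sketch (using select(:id) in place of pluck(:id) so there is still a relation to call to_sql/explain on; I believe .explain has been available since Rails 3.2):

# My old approach: no LIMIT in the SQL, so every id comes back and Ruby keeps the last 200.
TimedAsset.order("id").select(:id).to_sql
# => SELECT "timed_assets"."id" FROM "timed_assets" ORDER BY id

# Frederick's approach: the LIMIT is pushed down, so Postgres only returns 200 ids.
TimedAsset.order("id desc").limit(200).select(:id).to_sql
# => SELECT "timed_assets"."id" FROM "timed_assets" ORDER BY id desc LIMIT 200

# And Postgres's plan for the limited query:
puts TimedAsset.order("id desc").limit(200).select(:id).explain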
Anyways, thank you a whole bunch for the help, Frederick!