Hi, 
I am playing with the framework and trying to understand the SCD (slowly changing dimension) part of it.
My plan is to export from a MySQL database to a file, then transform from that file to another file, and finally write to a PostgreSQL database (using bulk upload).
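For context, the extract stage of my control file looks roughly like this (the connection, table, and column names below are simplified placeholders, not my exact setup):

```ruby
# Extract from MySQL (connection :operational_database as defined in
# database.yml) into an intermediate delimited file.
source :in, {
  :database => "operational",
  :target => :operational_database,
  :table => "board_accounts"
},
[:id, :crawling_enabled, :max_candidates_per_day]

destination :out, {
  :file => "../tmp/board_accounts_raw.csv"
},
{
  :order => [:id, :crawling_enabled, :max_candidates_per_day]
}
```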
1) Does bulk upload manage SCD, or is that not even an option in this case?
If it does manage SCD, I have the following output destination:
destination :out, {
  :file => "../tmp/t_board_accounts.csv",
  :append => false
},
{
  :order => [:key, :id, :crawling_enabled, :max_candidates_per_day],
  :virtual => {
    :key => ETL::Generator::SurrogateKeyGenerator.new(
      :query => 'SELECT MAX(board_account_key) FROM board_accounts',
      :target => :datawarehouse
    )
  }
}
2) But I see duplicates every time I run it. Is there something I need to do to prevent duplicates?
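From my reading of the docs there seems to be a :check_unique row processor that skips rows whose key has already been written; I have not verified this is the right fix for my case (the processor name and the :keys option are my interpretation of the documentation):

```ruby
# Hypothetical fix, based on my reading of the activewarehouse-etl docs:
# drop any row whose natural key has already been seen in this run.
before_write :check_unique, :keys => [:id]
```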
3) When I specify the following directive, I never see anything output in the file or in the database (even after truncating the final destination and rerunning it):
destination :out, {
  :file => "../tmp/t_board_accounts.csv",
  :append => false,
  :natural_key => [ :id ],
  :scd_fields => [ :crawling_enabled,
                   :max_candidates_per_day],
  :scd => {
    :dimension_target => :datawarehouse,
    :dimension_table => "board_accounts",
    :type => 2
  }
},
{
  :order => [:key, :id, :crawling_enabled, :max_candidates_per_day],
  :virtual => {
    :key => ETL::Generator::SurrogateKeyGenerator.new(
      :query => 'SELECT MAX(board_account_key) FROM board_accounts',
      :target => :datawarehouse
    )
  }
}
Thanks
Emmanuel