Ansible task execution until it completes

Aharonu

Jun 20, 2023, 12:48:51 PM
to ansible...@googlegroups.com, Todd Lewis, Vladimir Botka
Dear All,

Could you please help with the query below.

I have variable_data, which contains 3 records. I want to keep running a debug task every 1 minute until percent_complete == 100, and then complete the task execution.

Please help me achieve this requirement. Thank you in advance.

TASK [debug] ***************************************************
ok: [cluster1] => {
    "variable_data.msg": {
        "num_records": 3,
        "records": [
            {
                "percent_complete": 20,
                "state": "in-progress",
                "volume": "vol1",
                "vserver": "svm1"
            },
            {
                "percent_complete": 50,
                "state": "in-progress",
                "volume": "vol2",
                "vserver": "svm1"
            },
            {
                "percent_complete": 80,
                "state": "in-progress",
                "volume": "vol3",
                "vserver": "svm2"
            }
        ]
    }
}



Dick Visser

Jun 20, 2023, 12:51:58 PM
to ansible...@googlegroups.com, Todd Lewis, Vladimir Botka
This is not possible, because percent_complete is only 20, 50, or 80;
100 is not there. So whatever this means, it won't finish.


Aharonu

Jun 20, 2023, 1:00:35 PM
to ansible...@googlegroups.com, Todd Lewis, Vladimir Botka
I have another task above it (which runs for a long time); based on that task, percent_complete changes and the resulting data is stored in variable_data.

I want to run a debug task every 1 minute to check percent_complete, and exit the play once the records show percent_complete is 100.

Aharonu

Jun 20, 2023, 2:27:13 PM
to ansible...@googlegroups.com, Todd Lewis, Vladimir Botka
The play (debug) should execute every 1 minute, and percent_complete will change each time until it reaches 100.

When percent_complete has reached 100 for every record in variable_data, the play should exit.

Please help me achieve this. Thank you.

Kosala Atapattu

Jun 20, 2023, 4:02:50 PM
to ansible...@googlegroups.com, Todd Lewis, Vladimir Botka
Have you tried something like the following?

- name: wait for a url to work
  ansible.builtin.get_url:
    url: "https://myservice/"
    dest: "/tmp/"
  register: result
  retries: 10
  delay: 6
  until: result.status_code == 200

Since your record is JSON, you should be able to pull that off easily.
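A hedged adaptation of that pattern to the records structure from the question might look like this; the URL is hypothetical, and the point is simply to key the until condition off the list of records returned as JSON:

- name: Poll a status endpoint until every record reports 100 percent
  ansible.builtin.uri:
    url: "https://myservice/volume-move-status"   # hypothetical endpoint
    return_content: true
  register: result
  retries: 60
  delay: 60
  until: >-
    result.json.records
    | selectattr('percent_complete', 'ne', 100)
    | list | length == 0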

Cheers,
Kosala





Vladimir Botka

Jun 21, 2023, 2:36:39 AM
to Aharonu, ansible...@googlegroups.com
On Tue, 20 Jun 2023 22:18:22 +0530
Aharonu <ahar...@gmail.com> wrote:

> I have *variable_data*, which contains 3 records. I want to keep running a
> *debug* task every 1 minute until *percent_complete* == 100, and then
> complete the task execution.

Given the list for testing

variable_data:
- {percent_complete: 100, volume: vol1, vserver: svm1}
- {percent_complete: 100, volume: vol2, vserver: svm1}
- {percent_complete: 100, volume: vol3, vserver: svm2}

a task will complete when all *percent_complete* == 100

data_done: "{{ variable_data |
               selectattr('percent_complete', 'ne', 100) |
               length == 0 }}"
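For reference, a minimal self-contained check of that expression against the test list above (the explicit | list is an addition here; it keeps the expression working on older Jinja2 releases, where length cannot be applied to a generator):

- hosts: localhost
  gather_facts: false
  vars:
    variable_data:
      - {percent_complete: 100, volume: vol1, vserver: svm1}
      - {percent_complete: 100, volume: vol2, vserver: svm1}
      - {percent_complete: 100, volume: vol3, vserver: svm2}
    data_done: "{{ variable_data |
                   selectattr('percent_complete', 'ne', 100) |
                   list | length == 0 }}"
  tasks:
    - ansible.builtin.debug:
        var: data_done    # prints true once no record is below 100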

Without knowing the source of *variable_data*, it's not possible to tell
you what such a task should look like. Generally, you should test
*data_done*, set the delay, and decide how many times you want to retry

until: data_done
delay: 60
retries: 999

See "Retrying a task until a condition is met*
https://docs.ansible.com/ansible/latest/playbook_guide/playbooks_loops.html#retrying-a-task-until-a-condition-is-met

Moving forward, you should briefly describe at least: the inventory,
the source of the items (tasks, processes, ...) to be completed, and
how you consolidate *variable_data*.

Make it a "Minimal reproducible example". See
https://en.wikipedia.org/wiki/Minimal_reproducible_example


--
Vladimir Botka

Aharonu

Jun 21, 2023, 7:08:46 AM
to Vladimir Botka, ansible...@googlegroups.com
Hi Vladimir,

Per your suggestion, I have added the mentioned variable, but something is still missing and I got the error below:

The conditional check 'data_done' failed. The error was: Unexpected templating type error occurred on ({{ variable_data| selectattr('percent_complete', 'ne', 100) | length == 0 }}): object of type 'generator' has no len()


Could you please assist? Thank you.


This playbook uses the NetApp storage modules. The module 'na_ontap_volume' is used for moving volume data from one aggregate (pool) to another.
Once the 'Volume move' task is executed, I need to keep checking the volume move status until every percent_complete in the list reaches 100.

---
- hosts: all
  connection: local
  gather_facts: false
  collections:
    - netapp.ontap
  vars:
    login: &login
      username: "{{ admin_username }}"
      password: "{{ admin_password }}"
      https: true
      validate_certs: false
      use_rest: Always
  tasks:
  - name: block section task
    block:
      - name: Volume move
        na_ontap_volume:
          <<: *login
          aggregate_name: "{{ aggregate_name }}"
          vserver: svm1
          name: vol1
          hostname: "{{ inventory_hostname }}"
        register: ontap_volume
        no_log: false
      - name: run ontap rest cli command to check volume move status
        netapp.ontap.na_ontap_rest_cli:
          <<: *login
          hostname: "{{ inventory_hostname }}"
          command: 'volume/move?fields=percent-complete,state'
          params:
            volume: '*'
            vserver: '*'
          verb: 'GET'
        register: variable_data
      - debug: var=variable_data   # output mentioned below

      - set_fact:
          data_done: "{{ variable_data | selectattr('percent_complete', 'ne', 100) | length == 0 }}"
      - name: Retry a task until a certain condition is met
        debug:
      # register:
        until: data_done
        retries: 5
        delay: 10



the output of "variable_data"

TASK [debug] ****************
ok: [cluster1] => {
    "variable_data": {
        "changed": true,
        "failed": false,
        "msg": {
            "num_records": 3,
            "records": [
                {
                    "percent_complete": 30,
                    "state": "done",

                    "volume": "vol1",
                    "vserver": "svm1"
                },
                {
                    "percent_complete": 98,
                    "state": "healthy",

                    "volume": "vol2",
                    "vserver": "svm1"
                },
                {
                    "percent_complete": 50,
                    "state": "done",
                    "volume": "vol1",
                    "vserver": "svm2"
                }
            ]
        }
    }
}

Vladimir Botka

Jun 21, 2023, 12:29:54 PM
to Aharonu, ansible...@googlegroups.com
On Wed, 21 Jun 2023 11:08:16 +0000
Aharonu <ahar...@gmail.com> wrote:

> The conditional check 'data_done' failed. The error was: Unexpected
> templating type error occurred on ({{ variable_data|
> selectattr('percent_complete', 'ne', 100) | length == 0 }}): object of type
> 'generator' has no len()

Add the explicit conversion to *list*

data_done: "{{ variable_data |
               selectattr('percent_complete', 'ne', 100) |
               list | length == 0 }}"

This has already been fixed in newer versions; if you can, update to the latest version.

--
Vladimir Botka

Aharonu

Jun 21, 2023, 2:00:17 PM
to Vladimir Botka, ansible...@googlegroups.com
Hi Vladimir,

Thank you.

I will check with a higher version and update you on how it works.

Aharonu

Jun 21, 2023, 3:34:51 PM
to Vladimir Botka, ansible...@googlegroups.com
Hi Vladimir,

I have added |list and executed the task below in ansible [core 2.11.12], but no luck. I can't figure out where it is failing.

Error:

TASK [set_fact] *********************************
fatal: [10.250.198.160]: FAILED! => {}

MSG:

The task includes an option with an undefined variable. The error was: 'ansible.utils.unsafe_proxy.AnsibleUnsafeText object' has no attribute 'percent_complete'

The error appears to be in '/home/u630850/facts/netapp_volume_move.yml': line 48, column 19, but may
be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:

      - debug: var=variable_data
                  ^ here

There appears to be both 'k=v' shorthand syntax and YAML in this task. Only one syntax may be used.
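For reference, that message appears when a task mixes the k=v shorthand with YAML-style arguments; either form on its own is valid, for example:

# k=v shorthand
- debug: var=variable_data

# pure YAML
- debug:
    var: variable_data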

Playbook as previously mentioned:

      - debug: var=variable_data   # output mentioned below
      - set_fact:
          data_done: "{{ variable_data | selectattr('percent_complete', 'ne', 100) | list | length == 0 }}"

Vladimir Botka

Jun 21, 2023, 6:39:57 PM
to Aharonu, ansible...@googlegroups.com
On Wed, 21 Jun 2023 19:34:21 +0000
Aharonu <ahar...@gmail.com> wrote:

> - name: run ontap rest cli command to check volume move status
>   netapp.ontap.na_ontap_rest_cli:
>     <<: *login
>     hostname: "{{ inventory_hostname }}"
>     command: 'volume/move?fields=percent-complete,state'
>     params:
>       volume: '*'
>       vserver: '*'
>     verb: 'GET'
>   register: variable_data
> ...
>
> TASK [debug] ****************
> ok: [cluster1] => {
>     "variable_data": {
>         "changed": true,
>         "failed": false,
>         "msg": {
>             "num_records": 3,
>             "records": [
>                 {
>                     "percent_complete": 30,
>                     "state": "done",
>                     "volume": "vol1",
>                     "vserver": "svm1"
>                 },
>                 {
>                     "percent_complete": 98,
>                     "state": "healthy",
>                     "volume": "vol2",
>                     "vserver": "svm1"
>                 },
>                 {
>                     "percent_complete": 50,
>                     "state": "done",
>                     "volume": "vol1",
>                     "vserver": "svm2"
>                 }
>             ]
>         }
>     }
> }

Try the tasks below. Register *variable_data* in the first task and use it
in the *until* condition of the second task. You want to repeat it every
1 minute, so set *delay* to 60 seconds and *retries* to, for example, 60
(1 hour):

- netapp.ontap.na_ontap_rest_cli:
    <<: *login
    hostname: "{{ inventory_hostname }}"
    command: 'volume/move?fields=percent-complete,state'
    params:
      volume: '*'
      vserver: '*'
    verb: 'GET'
  register: variable_data

- netapp.ontap.na_ontap_rest_cli:
    <<: *login
    hostname: "{{ inventory_hostname }}"
    command: 'volume/move?fields=percent-complete,state'
    params:
      volume: '*'
      vserver: '*'
    verb: 'GET'
  register: variable_data
  until: variable_data.msg.records |
         selectattr('percent_complete', 'ne', 100) |
         list | length == 0
  delay: 60
  retries: 60

This way you are actually monitoring the volumes. However, Ansible can't
provide you with any intermediate data: the task is running on the remote
host, and you'll see the results only once the task completes and returns
them to the controller.

Try to configure NetApp system monitors if you want to see the progress. See
https://docs.netapp.com/us-en/cloudinsights/task_system_monitors.html#monitor-descriptions


--
Vladimir Botka

Aharonu

Jun 22, 2023, 6:56:04 AM
to Vladimir Botka, ansible...@googlegroups.com
Hi Vladimir,

Thank you again.

I think that is what I am looking for, but it only shows up when running with verbosity (-vvv).

Could you please go through the -vvv output mentioned below, where I can see the required output under MSG:

Can we get the output of only MSG instead of listing all the other output, so that I can see the required output?


TASK [run ontap rest cli command for vol move resister variabele] ***************************************************************************************************************************
task path: /home/user1/facts/netapp_volume_move.yml:38
< removed some data >
    "attempts": 1,
    "changed": true,
    "invocation": {
        "module_args": {
            "body": {},
            "cert_filepath": null,
            "command": "volume/move?fields=percent-
complete,state",
            "feature_flags": {},
            "hostname": "Cluster1",
            "http_port": null,
            "https": true,
            "key_filepath": null,
            "ontapi": null,
            "params": {
                "volume": "*",
                "vserver": "*"
            },
            "password": "VALUE_SPECIFIED_IN_NO_LOG_
PARAMETER",
            "use_rest": "Always",
            "username": "user1",
            "validate_certs": false,
            "verb": "GET"
        }
    },
    "retries": 36
}

MSG:

{'records': [{'vserver': 'svm1', 'volume': 'svm1_mp_v3', 'state': 'done', 'percent_complete': 100}, {'vserver': 'svm2', 'volume': 'user1_test_clone2', 'state': 'done', 'percent_complete': 100}, {'vserver': 'svm2', 'volume': 'user1_test_clone3', 'state': 'healthy', 'percent_complete': 98}], 'num_records': 3}
< removed some data >
FAILED - RETRYING: run ontap rest cli command for vol move resister variabele (34 retries left).Result was: {
    "attempts": 2,
    "changed": true,
    "invocation": {
        "module_args": {
            "body": {},
            "cert_filepath": null,
            "command": "volume/move?fields=percent-
complete,state",
            "feature_flags": {},
            "hostname": "Cluster1",
            "http_port": null,
            "https": true,
            "key_filepath": null,
            "ontapi": null,
            "params": {
                "volume": "*",
                "vserver": "*"
            },
            "password": "VALUE_SPECIFIED_IN_NO_LOG_
PARAMETER",
            "use_rest": "Always",
            "username": "user1",
            "validate_certs": false,
            "verb": "GET"
        }
    },
    "retries": 36
}

MSG:

{'records': [{'vserver': 'svm1', 'volume': 'svm1_mp_v3', 'state': 'done', 'percent_complete': 100}, {'vserver': 'svm2', 'volume': 'user1_test_clone2', 'state': 'done', 'percent_complete': 100}, {'vserver': 'svm2', 'volume': 'user1_test_clone3', 'state': 'healthy', 'percent_complete': 98}], 'num_records': 3}


< removed some data >
changed: [Cluster1] => {
    "attempts": 5,
    "changed": true,
    "invocation": {
        "module_args": {
            "body": {},
            "cert_filepath": null,
            "command": "volume/move?fields=percent-
complete,state",
            "feature_flags": {},
            "hostname": "Cluster1",
            "http_port": null,
            "https": true,
            "key_filepath": null,
            "ontapi": null,
            "params": {
                "volume": "*",
                "vserver": "*"
            },
            "password": "VALUE_SPECIFIED_IN_NO_LOG_
PARAMETER",
            "use_rest": "Always",
            "username": "user1",
            "validate_certs": false,
            "verb": "GET"
        }
    }
}

MSG:

{'records': [{'vserver': 'svm1', 'volume': 'svm1_mp_v3', 'state': 'done', 'percent_complete': 100}, {'vserver': 'svm2', 'volume': 'user1_test_clone2', 'state': 'done', 'percent_complete': 100}, {'vserver': 'svm2', 'volume': 'user1_test_clone3', 'state': 'done', 'percent_complete': 100}], 'num_records': 3}

Read vars_file '/home/user1/facts/cred_file_Privileged_account.yml'



Vladimir Botka

Jun 22, 2023, 12:04:29 PM
to Aharonu, ansible...@googlegroups.com
On Thu, 22 Jun 2023 10:55:32 +0000
Aharonu <ahar...@gmail.com> wrote:

> ...
> Can we get the output of only MSG instead of listing all the other output,
> so that I can see the required output?

This is the list of records: *variable_data.msg.records*.
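A minimal way to print just that list once the wait loop has finished, assuming variable_data is registered as in the tasks above:

- name: Show only the volume move records
  ansible.builtin.debug:
    var: variable_data.msg.records

Note that this does not hide the per-retry results shown earlier; those are printed by the stdout callback at higher verbosity while the loop is still running.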

I can't provide you with a better answer without a "Minimal reproducible
example". https://en.wikipedia.org/wiki/Minimal_reproducible_example

--
Vladimir Botka

Aharonu

unread,
Jun 22, 2023, 11:19:16 PM6/22/23
to ansible...@googlegroups.com, Vladimir Botka


Hi Team,

Greetings!

Could you please go through the -vvv output mentioned below, where I can see the required output under MSG:

Can we get the output of only MSG instead of listing all the other output, so that I can see the required output?

Thank you for your time and help.

========
playbook
=========
Below is the playbook, along with the output of TASK [run ontap rest cli command for vol move resister variabele].

ansible-playbook netapp_volume_move.yml

---
- hosts: all
  connection: local
  gather_facts: false
  collections:
    - netapp.ontap
  vars_files:

  vars:
    login: &login
      username: "{{ admin_username }}"
      password: "{{ admin_password }}"
      https: true
      validate_certs: false
      use_rest: Always
  #   feature_flags:
  #     trace_apis: true

  tasks:
  - name: block section task
    block:
      - name: Volum move initiated
        na_ontap_volume:
          <<: *login
          aggregate_name: ssd_a1_04
          vserver: "{{ vserver }}"
          name: "{{ name }}"

          hostname: "{{ inventory_hostname }}"
        register: ontap_volume
        no_log: false
      - name: run ontap rest cli command for vol move resister variabele
        netapp.ontap.na_ontap_rest_cli:
          <<: *login
          hostname: "{{ inventory_hostname }}"
          command: 'volume/move?fields=percent-complete,state,destination-aggregate'
          params:
            volume: '*'
            vserver: '*'
            destination-aggregate: '*'
          verb: 'GET'
        register: variable_data
        until: variable_data.msg.records|selectattr('percent_complete', 'ne', 100)|list| length == 0
        retries: 35
        delay: 30
      - name: Run debug task
        debug:
          var: variable_data.msg.records
(python3_venv) $

====================
result:
=====================
(python3_venv) $ ansible-playbook netapp_volume_move.yml -vvv
[DEPRECATION WARNING]: Ansible will require Python 3.8 or newer on the controller starting with Ansible 2.12. Current version: 3.6.8 (default, Aug 13 2020, 07:46:32) [GCC 4.8.5 20150623
(Red Hat 4.8.5-39)]. This feature will be removed from ansible-core in version 2.12. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
ansible-playbook [core 2.11.12]
  config file = /home/user1/facts/ansible.cfg
  configured module search path = ['/home/user1/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /apps/python3_venv/lib/python3.6/site-packages/ansible
  ansible collection location = /home/user1/.ansible/collections:/usr/share/ansible/collections
  executable location = /apps/python3_venv/bin/ansible-playbook
  python version = 3.6.8 (default, Aug 13 2020, 07:46:32) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
  jinja version = 3.0.2
  libyaml = True
Using /home/user1/facts/ansible.cfg as config file
host_list declined parsing /home/user1/facts/inventory/DEV/hosts.yml as it did not pass its verify_file() method
script declined parsing /home/user1/facts/inventory/DEV/hosts.yml as it did not pass its verify_file() method
Parsed /home/user1/facts/inventory/DEV/hosts.yml inventory source with yaml plugin
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: netapp_volume_move.yml ************************************************************************************************************************************************************
1 plays in netapp_volume_move.yml
Read vars_file '/home/user1/facts/cred_file_Privileged_account.yml'
Read vars_file '/home/user1/facts/cred_file_Privileged_account.yml'
Read vars_file '/home/user1/facts/cred_file_Privileged_account.yml'

PLAY [all] **********************************************************************************************************************************************************************************
Read vars_file '/home/user1/facts/cred_file_Privileged_account.yml'
META: ran handlers
Read vars_file '/home/user1/facts/cred_file_Privileged_account.yml'

TASK [Volum move initiated] *****************************************************************************************************************************************************************
task path: /home/user1/facts/netapp_volume_move.yml:24
<Cluster1> ESTABLISH LOCAL CONNECTION FOR USER: user1
<Cluster1> EXEC /bin/sh -c 'echo ~user1 && sleep 0'
<Cluster1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/user1/.ansible/tmp `"&& mkdir "` echo /home/user1/.ansible/tmp/ansible-tmp-1687409979.04913-10950-22668945714167 `" && echo ansible-tmp-1687409979.04913-10950-22668945714167="` echo /home/user1/.ansible/tmp/ansible-tmp-1687409979.04913-10950-22668945714167 `" ) && sleep 0'
<Cluster1> Attempting python interpreter discovery
<Cluster1> EXEC /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'python3.5'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'python2.6'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0'
<Cluster1> EXEC /bin/sh -c '/usr/bin/python && sleep 0'
Using module file /apps/python3_venv/lib/python3.6/site-packages/ansible_collections/netapp/ontap/plugins/modules/na_ontap_volume.py
<Cluster1> PUT /home/user1/.ansible/tmp/ansible-local-10941hhll8xxo/tmpyk5tyq1j TO /home/user1/.ansible/tmp/ansible-tmp-1687409979.04913-10950-22668945714167/AnsiballZ_na_ontap_volume.py
<Cluster1> EXEC /bin/sh -c 'chmod u+x /home/user1/.ansible/tmp/ansible-tmp-1687409979.04913-10950-22668945714167/ /home/user1/.ansible/tmp/ansible-tmp-1687409979.04913-10950-22668945714167/AnsiballZ_na_ontap_volume.py && sleep 0'
<Cluster1> EXEC /bin/sh -c '/usr/bin/python /home/user1/.ansible/tmp/ansible-tmp-1687409979.04913-10950-22668945714167/AnsiballZ_na_ontap_volume.py && sleep 0'
<Cluster1> EXEC /bin/sh -c 'rm -f -r /home/user1/.ansible/tmp/ansible-tmp-1687409979.04913-10950-22668945714167/ > /dev/null 2>&1 && sleep 0'
changed: [Cluster1] => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python"
    },
    "changed": true,
    "invocation": {
        "module_args": {
            "aggr_list": null,
            "aggr_list_multiplier": null,
            "aggregate_name": "ssd_a1_04",
            "atime_update": null,
            "auto_provision_as": null,
            "auto_remap_luns": null,
            "cert_filepath": null,
            "check_interval": 60,
            "comment": null,
            "compression": null,
            "cutover_action": null,
            "efficiency_policy": null,
            "encrypt": null,
            "export_policy": null,
            "feature_flags": {},
            "force_restore": null,
            "force_unmap_luns": null,
            "from_name": null,
            "from_vserver": null,
            "group_id": null,

            "hostname": "Cluster1",
            "http_port": null,
            "https": true,
            "inline_compression": null,
            "is_infinite": false,
            "is_online": true,
            "junction_path": null,
            "key_filepath": null,
            "language": null,
            "name": "user1_test_clone3",
            "nas_application_template": null,
            "nvfail_enabled": null,
            "ontapi": null,
            "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "percent_snapshot_space": null,
            "preserve_lun_ids": null,
            "qos_adaptive_policy_group": null,
            "qos_policy_group": null,
            "size": null,
            "size_change_threshold": 10,
            "size_unit": "gb",
            "sizing_method": null,
            "snapdir_access": null,
            "snapshot_auto_delete": null,
            "snapshot_policy": null,
            "snapshot_restore": null,
            "space_guarantee": null,
            "space_slo": null,
            "state": "present",
            "tiering_policy": null,
            "time_out": 180,
            "type": null,
            "unix_permissions": null,
            "use_rest": "Always",
            "user_id": null,

            "username": "user1",
            "validate_certs": false,
            "volume_security_style": null,
            "vserver": "svm2",
            "vserver_dr_protection": null,
            "wait_for_completion": true
        }
    },
    "modify": {
        "aggregate_name": "ssd_a1_04"
    }
}
Read vars_file '/home/user1/facts/cred_file_Privileged_account.yml'


TASK [run ontap rest cli command for vol move resister variabele] ***************************************************************************************************************************
task path: /home/user1/facts/netapp_volume_move.yml:38
<Cluster1> ESTABLISH LOCAL CONNECTION FOR USER: user1
<Cluster1> EXEC /bin/sh -c 'echo ~user1 && sleep 0'
<Cluster1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/user1/.ansible/tmp `"&& mkdir "` echo /home/user1/.ansible/tmp/ansible-tmp-1687409981.3215845-10986-170223569972133 `" && echo ansible-tmp-1687409981.3215845-10986-170223569972133="` echo /home/user1/.ansible/tmp/ansible-tmp-1687409981.3215845-10986-170223569972133 `" ) && sleep 0'
Using module file /apps/python3_venv/lib/python3.6/site-packages/ansible_collections/netapp/ontap/plugins/modules/na_ontap_rest_cli.py
<Cluster1> PUT /home/user1/.ansible/tmp/ansible-local-10941hhll8xxo/tmp0xvrlu40 TO /home/user1/.ansible/tmp/ansible-tmp-1687409981.3215845-10986-170223569972133/AnsiballZ_na_ontap_rest_cli.py
<Cluster1> EXEC /bin/sh -c 'chmod u+x /home/user1/.ansible/tmp/ansible-tmp-1687409981.3215845-10986-170223569972133/ /home/user1/.ansible/tmp/ansible-tmp-1687409981.3215845-10986-170223569972133/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c '/usr/bin/python /home/user1/.ansible/tmp/ansible-tmp-1687409981.3215845-10986-170223569972133/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c 'rm -f -r /home/user1/.ansible/tmp/ansible-tmp-1687409981.3215845-10986-170223569972133/ > /dev/null 2>&1 && sleep 0'
FAILED - RETRYING: run ontap rest cli command for vol move resister variabele (35 retries left).Result was: {
<Cluster1> EXEC /bin/sh -c 'echo ~user1 && sleep 0'
<Cluster1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/user1/.ansible/tmp `"&& mkdir "` echo /home/user1/.ansible/tmp/ansible-tmp-1687410012.5154507-10986-194750717210998 `" && echo ansible-tmp-1687410012.5154507-10986-194750717210998="` echo /home/user1/.ansible/tmp/ansible-tmp-1687410012.5154507-10986-194750717210998 `" ) && sleep 0'
Using module file /apps/python3_venv/lib/python3.6/site-packages/ansible_collections/netapp/ontap/plugins/modules/na_ontap_rest_cli.py
<Cluster1> PUT /home/user1/.ansible/tmp/ansible-local-10941hhll8xxo/tmp62va_1q_ TO /home/user1/.ansible/tmp/ansible-tmp-1687410012.5154507-10986-194750717210998/AnsiballZ_na_ontap_rest_cli.py
<Cluster1> EXEC /bin/sh -c 'chmod u+x /home/user1/.ansible/tmp/ansible-tmp-1687410012.5154507-10986-194750717210998/ /home/user1/.ansible/tmp/ansible-tmp-1687410012.5154507-10986-194750717210998/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c '/usr/bin/python /home/user1/.ansible/tmp/ansible-tmp-1687410012.5154507-10986-194750717210998/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c 'rm -f -r /home/user1/.ansible/tmp/ansible-tmp-1687410012.5154507-10986-194750717210998/ > /dev/null 2>&1 && sleep 0'
<Cluster1> EXEC /bin/sh -c 'echo ~user1 && sleep 0'
<Cluster1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/user1/.ansible/tmp `"&& mkdir "` echo /home/user1/.ansible/tmp/ansible-tmp-1687410043.258642-10986-9337563152068 `" && echo ansible-tmp-1687410043.258642-10986-9337563152068="` echo /home/user1/.ansible/tmp/ansible-tmp-1687410043.258642-10986-9337563152068 `" ) && sleep 0'
Using module file /apps/python3_venv/lib/python3.6/site-packages/ansible_collections/netapp/ontap/plugins/modules/na_ontap_rest_cli.py
<Cluster1> PUT /home/user1/.ansible/tmp/ansible-local-10941hhll8xxo/tmp1d45ynly TO /home/user1/.ansible/tmp/ansible-tmp-1687410043.258642-10986-9337563152068/AnsiballZ_na_ontap_rest_cli.py
<Cluster1> EXEC /bin/sh -c 'chmod u+x /home/user1/.ansible/tmp/ansible-tmp-1687410043.258642-10986-9337563152068/ /home/user1/.ansible/tmp/ansible-tmp-1687410043.258642-10986-9337563152068/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c '/usr/bin/python /home/user1/.ansible/tmp/ansible-tmp-1687410043.258642-10986-9337563152068/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c 'rm -f -r /home/user1/.ansible/tmp/ansible-tmp-1687410043.258642-10986-9337563152068/ > /dev/null 2>&1 && sleep 0'
<Cluster1> EXEC /bin/sh -c 'echo ~user1 && sleep 0'
<Cluster1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/user1/.ansible/tmp `"&& mkdir "` echo /home/user1/.ansible/tmp/ansible-tmp-1687410074.0302222-10986-29882595310304 `" && echo ansible-tmp-1687410074.0302222-10986-29882595310304="` echo /home/user1/.ansible/tmp/ansible-tmp-1687410074.0302222-10986-29882595310304 `" ) && sleep 0'
Using module file /apps/python3_venv/lib/python3.6/site-packages/ansible_collections/netapp/ontap/plugins/modules/na_ontap_rest_cli.py
<Cluster1> PUT /home/user1/.ansible/tmp/ansible-local-10941hhll8xxo/tmpr6quwepc TO /home/user1/.ansible/tmp/ansible-tmp-1687410074.0302222-10986-29882595310304/AnsiballZ_na_ontap_rest_cli.py
<Cluster1> EXEC /bin/sh -c 'chmod u+x /home/user1/.ansible/tmp/ansible-tmp-1687410074.0302222-10986-29882595310304/ /home/user1/.ansible/tmp/ansible-tmp-1687410074.0302222-10986-29882595310304/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c '/usr/bin/python /home/user1/.ansible/tmp/ansible-tmp-1687410074.0302222-10986-29882595310304/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c 'rm -f -r /home/user1/.ansible/tmp/ansible-tmp-1687410074.0302222-10986-29882595310304/ > /dev/null 2>&1 && sleep 0'
<Cluster1> EXEC /bin/sh -c 'echo ~user1 && sleep 0'
<Cluster1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/user1/.ansible/tmp `"&& mkdir "` echo /home/user1/.ansible/tmp/ansible-tmp-1687410104.7785237-10986-246365905929535 `" && echo ansible-tmp-1687410104.7785237-10986-246365905929535="` echo /home/user1/.ansible/tmp/ansible-tmp-1687410104.7785237-10986-246365905929535 `" ) && sleep 0'
Using module file /apps/python3_venv/lib/python3.6/site-packages/ansible_collections/netapp/ontap/plugins/modules/na_ontap_rest_cli.py
<Cluster1> PUT /home/user1/.ansible/tmp/ansible-local-10941hhll8xxo/tmpy8vvt7gd TO /home/user1/.ansible/tmp/ansible-tmp-1687410104.7785237-10986-246365905929535/AnsiballZ_na_ontap_rest_cli.py
<Cluster1> EXEC /bin/sh -c 'chmod u+x /home/user1/.ansible/tmp/ansible-tmp-1687410104.7785237-10986-246365905929535/ /home/user1/.ansible/tmp/ansible-tmp-1687410104.7785237-10986-246365905929535/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c '/usr/bin/python /home/user1/.ansible/tmp/ansible-tmp-1687410104.7785237-10986-246365905929535/AnsiballZ_na_ontap_rest_cli.py && sleep 0'
<Cluster1> EXEC /bin/sh -c 'rm -f -r /home/user1/.ansible/tmp/ansible-tmp-1687410104.7785237-10986-246365905929535/ > /dev/null 2>&1 && sleep 0'