Configuring an HP GbE2c switch with SCP.


Chris

Sep 8, 2013, 9:41:21 PM9/8/13
to ansible...@googlegroups.com
Hi there,

I'm brand new to Ansible and would like to know how you would approach the following situation: I'm trying to push configuration files to the HP GbE2c L2/L3 switches that I maintain. I can access the switches over SSH, but I have to enter a password manually (there is no way to install an authorized RSA key). From there I get a Cisco-like CLI to manage the configuration. I can also push a configuration file remotely using SCP, with a syntax like the following, for instance from a Linux host:

> scp config.cfg admin@switch_ip:putcfg_apply_save
Password:

which uploads the file "config.cfg" to the switch, applies the new configuration, and flashes it to persistent memory.

I would like to do this automatically with Ansible, as I have a fairly large number of switches to manage (about 20). I need two things for that:
1 - A way to enter the SSH admin password automatically, like with Tcl's "expect".
2 - A way to run the scp command locally, since I obviously can't execute arbitrary commands on the switch itself.
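For requirement 1, one tool often suggested in place of expect is sshpass, which feeds a password to scp non-interactively. A minimal sketch, assuming sshpass is installed; the SWITCH_PASS variable, the function name, and the DRY_RUN switch are illustrative placeholders, not something from this thread:

```shell
# Hypothetical wrapper: push a switch config without an interactive prompt.
# Assumes sshpass is installed and SWITCH_PASS holds the admin password.
push_config() {
    switch="$1"
    cmd="sshpass -p \"\$SWITCH_PASS\" scp ${switch}.cfg admin@${switch}:putcfg_apply_save"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        # Dry run: print the command instead of executing it.
        echo "$cmd"
    else
        eval "$cmd"
    fi
}
```

Note that storing a plaintext password in an environment variable or script trades security for convenience; it is a workaround for devices that cannot hold an authorized key.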

So far I have come up with the following solution:

---
- hosts: switches
  gather_facts: no

  tasks:
    - name: Push configuration to the switches
      local_action: command scp ${inventory_hostname}.cfg admin@${inventory_hostname}:putcfg_apply_save

which has the following limitations:

1 - I still need to enter the switch password manually, which pretty much defeats the purpose of automation.
2 - Although the config file is transferred properly, the command ends up with a "failed" result, as can be seen from this output:

# ansible-playbook -vvv test.yml

PLAY [switch_name] **********************************************************

TASK: [Get configuration from the switches] ***********************************
<127.0.0.1> EXEC ['/bin/sh', '-c', 'mkdir -p $HOME/.ansible/tmp/ansible-1378690317.33-53871091023451 && echo $HOME/.ansible/tmp/ansible-1378690317.33-53871091023451']
<127.0.0.1> REMOTE_MODULE command scp switch_name.cfg admin@switch_name-ilo:putcfg
<127.0.0.1> PUT /tmp/tmpQvdMMI TO /root/.ansible/tmp/ansible-1378690317.33-53871091023451/command
<127.0.0.1> EXEC ['/bin/sh', '-c', '/usr/bin/python /root/.ansible/tmp/ansible-1378690317.33-53871091023451/command; rm -rf /root/.ansible/tmp/ansible-1378690317.33-53871091023451/ >/dev/null 2>&1']
Enter password:
failed: [switch_name] => {"changed": true, "cmd": ["scp", "switch_name.cfg", "admin@switch_name-ilo:putcfg"], "delta": "0:00:07.253088", "end": "2013-09-09 01:32:04.652952", "rc": 1, "start": "2013-09-09 01:31:57.399864"}
stderr:
Switch: executing scp command - putcfg.
Received disconnect from 10.10.10.1: 11: Logged out.
lost connection

FATAL: all hosts have already failed -- aborting

PLAY RECAP ********************************************************************
           to retry, use: --limit @/root/test.retry

switch_name              : ok=0    changed=0    unreachable=0    failed=1
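Since the switch appears to drop the connection after accepting the file, the non-zero exit code could be tolerated explicitly. A sketch using Ansible's ignore_errors (this is an assumption about the cause, and it masks genuine scp failures too, so it is a workaround rather than a fix):

```yaml
- name: Push configuration to the switches
  local_action: command scp ${inventory_hostname}.cfg admin@${inventory_hostname}:putcfg_apply_save
  ignore_errors: yes
```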

So my question is: how would you approach this situation with Ansible? Is there a better way?

Thanks for the help,
Chris

Brian Coca

Sep 8, 2013, 10:04:30 PM9/8/13
to ansible...@googlegroups.com
"ansible-playbook -k -vvv test.yml" will ask you for the ssh password and send it to each host but prompt you only once

as for the commands, try the following:


- hosts: switches
  gather_facts: no
  user: admin
  tasks:
    - name: Push configuration to the switches
      local_action: command scp ${inventory_hostname}.cfg

    - name: save config
      raw: putcfg_apply_save