Ansible

Ad-hoc with pipe

Source

Running Ansible ad-hoc commands causes trouble when you try to use redirections and pipes.

Use the following syntax instead.

ansible -i <inventory> all -a "bash -c 'dpkg -l | grep python-apt'"

The documentation states this quite clearly by the way:

The given command will be executed on all selected nodes. It will not be processed through the shell, so variables like $HOME and operations like <, >, |, ; and & will not work (use the shell module if you need these features).

Source

This is valid for the command module as well as when running ad-hoc commands.
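
If the shell features are actually needed, the shell module can be used instead of the quoted bash trick. A minimal sketch (the package name is only an example; the failed_when handling assumes grep's usual return codes):

# Ad-hoc with the shell module
ansible -i <inventory> all -m shell -a "dpkg -l | grep python-apt"

# The same as a playbook task
- name: CHECK FOR python-apt
  ansible.builtin.shell: "dpkg -l | grep python-apt"
  register: python_apt_check
  changed_when: false
  failed_when: python_apt_check.rc not in [0, 1]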

Hostname splitting

Source

Ansible/Jinja itself does not offer a dedicated filter for splitting strings. Use the Python string method split() for that:

---
- hosts: all
  tasks:
    - name: DEBUGGING
      debug:
        msg: "{{ inventory_hostname.split('.')[0] }}"

Disable Host Key Checking

Source

Disable strict host key checking when in trouble:

export ANSIBLE_HOST_KEY_CHECKING=False

In .ansible.cfg:

[defaults]
host_key_checking = False

This can also be automated in a playbook by itself:

Source

---
- hosts: all
  gather_facts: no
  become: no
  tasks:
    - name: GET PORT, DEFAULT 22
      delegate_to: localhost
      set_fact:
        ansible_ssh_port: "{{ hostvars[inventory_hostname]['ansible_ssh_port'] | default('22') }}"
    - name: ENSURE SSH HOST KEY KNOWN
      delegate_to: localhost
      lineinfile:
        dest: ~/.ssh/known_hosts
        create: yes
        state: present
        line: "{{ lookup('pipe', 'ssh-keyscan -trsa -p' + ansible_ssh_port + ' ' + ansible_ssh_host) }}"

Scripting

Run Ansible playbooks as scripts:

#!/usr/bin/ansible-playbook
- hosts: 127.0.0.1
  connection: local
  tasks:
    - name: DO THE HELLO WORLD
      shell: echo Hello World!

Debugging

Source

Keyword: debugger

Values:

always|never|on_failed|on_unreachable|on_skipped

Debugging on a task

- name: Execute a command
  command: false
  debugger: on_failed

Debugging on a playbook

- name: Play
  hosts: all
  debugger: on_skipped
  tasks:
    - name: Execute a command
      command: true
      when: false

Debugging as strategy

Variant 1:

- hosts: test
  strategy: debug
  tasks:
  ...

Variant 2:

ANSIBLE_STRATEGY=debug

Commands

  • p(print) task|task_vars|task.args|host|result: print output
  • r(redo) : Run the task again
  • c(continue): Just continue
  • q(quit): Just quit from the debugger. Playbook execution is aborted.

Variable inspection

Source

Print multiple variables

- name: "Ansible | print multiple variables"
  debug:
    msg: |
      The role is {{ host_role }}
      The environment is {{ host_environment }}

List all known variables and facts

- name: "Ansible | List all known variables and facts"
  debug:
    var: hostvars[inventory_hostname]

Serial

Source

The serial keyword limits how many hosts Ansible runs against in parallel within a play.

E.g. serial: 1 will cause the playbook to go through the hosts one by one.

Several batch sizes can be given in a row:

- name: SERIAL PLAYBOOK
  hosts: all
  become: True
  serial:
    - '10%'
    - '50%'
    - '100%'
  tasks:
    - name: 'DEBUG MESSAGE'
      debug:
        msg: 'Test Serial'

The play first runs on 10 percent of the hosts in the inventory, then on another 50 percent, and finally on the remaining hosts.


Best Practice

Example playbook execution

  • Put a comment at the top of the playbook to demonstrate the execution of the playbook
---
# Description
#   Some overall description on what this playbook does.
#
# Parameters
#   - sample_folder type str
#     Some more explanation.
#
# Sample execution
#   ansible-playbook playbook.yml --extra-vars sample_folder=test_folder -i hosts
#
- name: 'Playbook example'
  hosts: all
  vars:
    default_work_folder: '/tmp'
  tasks:
    - name: 'Create TMP FOLDER'
      file:
        state: directory
        path: "{{ default_work_folder }}/{{ sample_folder }}"

Validation role/playbook

Secure the playbook execution with an additional layer on every role, which can be called the validation layer.

This ensures:

  • The correct key/value pairs have been loaded.
  • The variables being loaded are related to the environment.
  • The variables are consistent with each other.
  • Any other checks that prevent a failed execution.
---
### Variable validation
- hosts: all
  pre_tasks:
    - name: '[VALIDATION] ENTERING VALIDATION STAGE'
      debug:
        msg: "ENTERING ON VALIDATION STATE WITH MODE: {{ service }}"
        verbosity: 1

    - name: '[VALIDATION] CHECKING ANSIBLE VARIABLES FOR THIS ROLE.'
      fail:
        msg: "Check the undefined variables: {{ item }}"
      when:
        - item is undefined
          or item is none
      with_items:
        - "{{ VAR1 }}"
        - "{{ VAR2 }}"
        - "{{ VAR3 }}"

    - name: '[VALIDATION] CHECKING VARIABLES AGAINST OTHER STUFF'
      fail:
        msg: " Error on variables, check main $foo values:
          - VAR1: {{ VAR1 }}
          - VAR2: {{ VAR2 }}
          - VAR3: {{ VAR3 }}
          - {{ RELEASE }}
        "
      when:
        - VAR1 != VAR2
          or VAR2.VERSION != RELEASE

Fact vs. Variable

Affects: Ansible Tower

In a scenario where a variable needs to be set as a fact in Ansible/Ansible Tower, identical names for variables and facts should be avoided.

In the given situation the inventory was setting a variable on a host. In order to use smart inventory filters in Ansible Tower, the value needed to be available as a fact as well. When choosing the same name for the fact as for the variable, some side effects became visible:

  • When updating/changing the variable, the already existing fact was not updated. This is due to the concept of facts and the precedence rules within Ansible: it was not clear whether the value should be taken from the existing fact with the old value or from the variable with the updated value, so overwriting via the variable did not work.
  • Assigning the fact without caching it did not lead to the desired outcome either; the fact was still cached.

Solution: Use different names for variables and facts where possible.
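
A minimal sketch of that naming scheme (the variable maintenance_requested and the fact maintenance_state are made-up names for illustration):

# Inventory variable (e.g. in host_vars), named differently from the fact:
#   maintenance_requested: 'yes'

- name: 'SET MAINTENANCE FACT FROM THE VARIABLE'
  set_fact:
    maintenance_state: "{{ maintenance_requested }}"
    cacheable: true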


Custom facts

Source

  • At the beginning of every Ansible run the Setup task is executed. This task gathers facts from the host.
  • Custom facts can be easily added and the setup task be extended:

    1. Deploy a custom script in /etc/ansible/facts.d: The return value of the script must be valid JSON data. The script itself can be written in any scripting language available on the host. If the file is not executable but only contains JSON data, the values are simply read as they are.

    #!/usr/bin/env bash
    # file: /etc/ansible/facts.d/date.fact
    DATE=$(date)
    echo "{\"date\" : \"${DATE}\"}"

    2. The facts are available within Ansible at hostvars.<hostname>.ansible_local.*. The asterisk (*) is replaced by the filename prefix used in the fact file (see the example below).
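
Assuming the date.fact script from above has been deployed and facts were gathered afterwards, the value can be read like this:

- name: 'SHOW CUSTOM FACT'
  debug:
    msg: "{{ ansible_local['date']['date'] }}"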

Setup

- name: ENSURE CUSTOM FACT DIRECTORY
  file:
    path: '/etc/ansible/facts.d'
    state: 'directory'

- name: ENSURE CUSTOM FACT FILE
  copy:
    src: files/custom.fact
    dest: /etc/ansible/facts.d/custom.fact
    mode: 0750

Re-run the setup task

- name: 'RE-RUN SETUP TO USE CUSTOM FACTS'
  setup: ~

Set facts

  • Facts that have been set via a playbook and stored in the fact cache cannot simply be overwritten by defining the same fact again and assigning a new value. This is especially true when a variable shall overwrite a fact set by set_fact.
  • Variables have a lower precedence than facts, therefore a variable will not overwrite or update an existing fact. The solution here is to transform the variable into a temporary fact and then re-assign it to the permanent one.
- hosts: all
  tasks:
     - name: 'SET TEMPORARY FACT'
       set_fact:
         maintenance_tmp: 'yes'
         cacheable: False
     - name: 'SET PERMANENT FACT'
       set_fact:
         maintenance: "{{ maintenance_tmp }}"
         cacheable: True

Ansible Tower/AWX

Surveys

Source

The syntax for the tower module tower_job_template parameter survey_spec is kind of hidden and currently not well documented.

survey_spec: '{"spec": [{"index": 0, "question_name": "my question?", "default": "mydef", "variable": "myvar", "type": "text", "required": "false"}], "description": "test", "name": "test"}'

Example:

survey_enabled: yes
survey_spec: '{"spec": [{"index": 0, "question_name": "SELECT VM STATUS", "question_description": "Set the VMs to the selected status", "required": true, "type": "multiplechoice", "variable": "host_action", "min": null, "max": null, "default": "nothing", "choices": "restart\nshutdown\npower-off\npower-on\nnothing", "new_question": true}], "description": "", "name": ""}'

Remove all hosts

# Remove all hosts stored in Ansible Tower
$ tower-cli host list -f id -a | xargs -n 1 tower-cli host delete

Query activity on the host

The Tower web interface doesn't give you access to all the information available. Switch to the tower-cli command instead.

# List all activity
$ tower-cli activity_stream list

# List all activity by user john doe
$ tower-cli user list # Note the userid
== ============== ...
 1 johndoe
.. .....
== ============== ...
$ tower-cli activity_stream list -a --query actor 1

# List all schedules created by user 1, output json
$ tower-cli user list # Note the userid
== ============== ...
 1 johndoe
.. .....
== ============== ...
$ tower-cli activity_stream list -a \
  --query object1 schedule \
  --query operation create \
  --query actor 1 \
  -f json

Conditions

When

Source

Matching patterns:

The following condition limits the execution of a task to all hosts in the current play whose inventory hostname contains 'foobar' (the search test performs a regular expression search with the pattern 'foobar').

- hosts: all
  tasks:
    - name: RUNNING ON CONDITION
      debug:
        msg: "task executed"
      when: inventory_hostname is search('foobar')

Jinja2

Source

  • Multiple if conditions (also inside templates):
{% if ( (foo == 'foo') and
  (bar == 'bar') ) %}
{% endif %}

Inventories

YAML syntax with variable definition

Source

---
all:
  hosts:
    hosta:
      settinga: vala
      settingb: valb
    hostb:
  vars:
    settingc: valc
group1:
  hosts:
    hostb:
  vars:
    settingd: vald

Inventory parameters

Source

My most used ones (see the inventory sketch below):

  • ansible_host: The name of the host to connect to, if different from the alias you wish to give to it.
  • ansible_port: The ssh port number, if not 22.
  • ansible_user: The default ssh user name to use.
  • ansible_ssh_private_key_file: Private key file used by ssh. Useful if using multiple keys and you don’t want to use SSH agent.
  • ansible_become: Equivalent to ansible_sudo or ansible_su, allows to force privilege escalation.
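
A minimal INI-style inventory sketch using these parameters (hostnames, addresses, user and key path are placeholders):

[webservers]
web01 ansible_host=192.0.2.10 ansible_port=2222 ansible_user=deploy
web02 ansible_host=192.0.2.11 ansible_ssh_private_key_file=~/.ssh/id_ed25519 ansible_become=true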

Variables

Variable as index

Source

You can use the content of a variable as an index to access other values. See the following example.

- hosts: localhost
  gather_facts: false
  become: false
  vars:
    input: 'FrontEnd'
    network:
      subnets:
        frontend: 'Here is the frontend.'
        middletier: 'Here is the middletier.'
        backend: 'Here is the backend.'
  tasks:
    - block:
      - debug:
          msg: "Input: {{ input }}"
      - debug:
          msg: "{{ vars.network.subnets[input|lower] }}"
      delegate_to: localhost

While the input here is "FrontEnd", the second debug task outputs the string "Here is the frontend." This allows you to gather settings, e.g. via a survey in Ansible Tower, and set the required values.


Collections

Initialize:

ansible-galaxy collection init <namespace>.<collection>

Errors

URI

Unhashable type

Source

When passing JSON content to the uri module's body parameter, the error message "Unhashable Type" might show up. Try adding the parameter "body_format: json" to the uri module (and remove any "convert_data: False"). The content type header parameter can then be omitted.
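
A minimal sketch of such a call (URL and payload are placeholders):

- name: 'POST JSON PAYLOAD'
  uri:
    url: 'https://example.com/api/v1/items'
    method: POST
    body_format: json
    body:
      name: 'example'
      enabled: true
    status_code: [200, 201]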

git module

dict object is not a valid boolean

When using the parameter update of the Ansible git module, a situation might occur where this error shows up:

dict object is not a valid boolean ...

Adding an explicit boolean filter might solve this:

# Fails #1
git:
  ...
  update: "{{ item.update | default(omit) }}"
  ...

# Works
git:
  ...
  update: "{{ item.update | default(omit) | bool }}"
  ...

Current time

Source

It's possible to use the current time in playbooks, e.g. to tag virtual hosts with a timestamp of their creation.

- hosts: all
  gather_facts: true
  tasks:
    - name: Output
      debug:
        msg: "{{ ansible_date_time.date }} {{ ansible_date_time.time }}"

However, on some hosts this turned out to output the local time without timezone adjustment.

An alternative method would be:

Source

- hosts: all
  gather_facts: false
  tasks:
    - name: Output
      debug:
        msg: "{{ lookup('pipe', 'TZ=\":Europe/Oslo\" date \"+%Y-%m-%d %H:%M:%S\"') }}"

Ansible Vault

Encrypt string from stdin

ansible-vault encrypt_string --ask-vault-pass --stdin-name <variable>
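
A usage sketch (variable name and value are placeholders, the vault password is prompted interactively):

# Encrypt the value read from stdin for the variable db_password
echo -n 's3cr3t' | ansible-vault encrypt_string --ask-vault-pass --stdin-name db_password

# The printed !vault block can be pasted into a vars file, e.g. group_vars/all.yml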

Filter

version_compare

Source

- hosts: "localhost"
  gather_facts: false
  vars:
    version: "2.7.11"
  tasks:
    - name: check if version is within acceptable range
      assert:
        that:
          - version is version_compare('2.7', '>=', strict=True)
          - version is version_compare('4.0', '<', strict=True)

The parameter strict=true requires at least one dot and does not allow more than two dots.


On True and False

Source

Ansible replaces yes|no in variables with True|False and thereby violates the principle of least surprise.

This can only be enforced in YAML by either quoting the value or explicitly defining it as a string:

# Option 1
variable: 'no'

# Option 2
variable: !!str no

Backslash control

Source

When putting backslashes into YAML strings and handling them in Ansible within Jinja, there are interesting side effects:

---
- hosts: all
  vars:
    foo:
      bar: \ue9ff
  tasks:
    - name: testfile
      copy:
        dest: /tmp/testfile
        content: "{{ foo }}"

This will result in a file /tmp/testfile with the following content:

{"bar": "\\ue9ff"}

Trying to escape this further will not resolve the issue:

format      result
\ue9ff      \\ue9ff
\\ue9ff     \\\\ue9ff
\\\ue9ff    \\\\\\ue9ff
'\ue9ff'    \\ue9ff

The solution to this is to put the string into double quotes instead.

---
- hosts: all
  vars:
    foo:
      bar: "\ue9ff"
  tasks:
    - name: testfile
      copy:
        dest: /tmp/testfile
        content: "{{ foo }}"
# Result:
{"bar": "\ue9ff"}

Whitespace control

Source

Jinja templates are not easy to write in a readable way.

The default whitespace control settings for ansible templates are

  • lstrip_blocks: false

If this is set to True, leading spaces and tabs are stripped from the start of a line up to a block tag.

  • trim_blocks: true

If this is set to True, the first newline after a block tag is removed (block tag, not variable tag!).

Add this line, with either combination of values, as the first line of the Ansible Jinja template to alter the settings:

#jinja2: lstrip_blocks: "True (or False)", trim_blocks: "True (or False)"
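
A small template sketch (the variable admin_users and the file name are made up): with both settings enabled, the indentation in front of the block tags and the newlines directly after them do not end up in the rendered output, only the generated lines do.

#jinja2: lstrip_blocks: "True", trim_blocks: "True"
{# file: templates/sudoers.j2 (hypothetical) #}
{% for user in admin_users %}
  {% if user.sudo | default(false) %}
{{ user.name }} ALL=(ALL) NOPASSWD: ALL
  {% endif %}
{% endfor %}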

Color output

Source

To force colored output in a non-TTY environment (like CI/CD), set these variables:

PY_COLORS=1
ANSIBLE_FORCE_COLOR=1
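
For example, in a CI job (inventory and playbook names are placeholders):

PY_COLORS=1 ANSIBLE_FORCE_COLOR=1 ansible-playbook -i hosts site.yml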

Run ansible with shebang

You can run Ansible playbooks with Ansible as interpreter:

touch hello_world.ans
chmod u+x hello_world.ans
#!/usr/bin/env -S ansible-playbook -i localhost,
---
# file: hello_world.ans

- name: Output some test string
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:

     - name: Output something
       ansible.builtin.debug:
         msg: "Hello World:"

type test

Source

These type tests are preferable to ansible.builtin.type_debug because they work reliably and independently of the variable notation used with --extra-vars.

tasks:
  - name: "String interpretation"
    vars:
      a_string: "A string"
      a_dictionary: {"a": "dictionary"}
      a_list: ["a", "list"]
    assert:
      that:
      # Note that a string is classed as also being "iterable" and "sequence", but not "mapping"
      - a_string is string and a_string is iterable and a_string is sequence and a_string is not mapping

      # Note that a dictionary is classed as not being a "string", but is "iterable", "sequence" and "mapping"
      - a_dictionary is not string and a_dictionary is iterable and a_dictionary is mapping

      # Note that a list is classed as not being a "string" or "mapping" but is "iterable" and "sequence"
      - a_list is not string and a_list is not mapping and a_list is iterable

  - name: "Number interpretation"
    vars:
      a_float: 1.01
      a_float_as_string: "1.01"
      an_integer: 1
      an_integer_as_string: "1"
    assert:
      that:
      # Both a_float and an_integer are "number", but each has their own type as well
      - a_float is number and a_float is float
      - an_integer is number and an_integer is integer

      # Both a_float_as_string and an_integer_as_string are not numbers
      - a_float_as_string is not number and a_float_as_string is string
      - an_integer_as_string is not number and a_float_as_string is string

      # a_float or a_float_as_string when cast to a float and then to a string should match the same value cast only to a string
      - a_float | float | string == a_float | string
      - a_float_as_string | float | string == a_float_as_string | string

      # Likewise an_integer and an_integer_as_string when cast to an integer and then to a string should match the same value cast only to an integer
      - an_integer | int | string == an_integer | string
      - an_integer_as_string | int | string == an_integer_as_string | string

      # However, a_float or a_float_as_string cast as an integer and then a string does not match the same value cast to a string
      - a_float | int | string != a_float | string
      - a_float_as_string | int | string != a_float_as_string | string

      # Again, Likewise an_integer and an_integer_as_string cast as a float and then a string does not match the same value cast to a string
      - an_integer | float | string != an_integer | string
      - an_integer_as_string | float | string != an_integer_as_string | string

  - name: "Native Boolean interpretation"
    loop:
    - yes
    - true
    - True
    - TRUE
    - no
    - No
    - NO
    - false
    - False
    - FALSE
    assert:
      that:
      # Note that while other values may be cast to boolean values, these are the only ones that are natively considered boolean
      # Note also that `yes` is the only case-sensitive variant of these values.
      - item is boolean

Include vs. import

Source

                           include_*                                 import_*
Type of re-use             Dynamic                                   Static
When processed             At runtime, when encountered              Pre-processed during playbook parsing
Task or play               All includes are tasks                    import_playbook cannot be a task
Task options               Apply only to include task itself         Apply to all child tasks in import
Calling from loops         Executed once for each loop item          Cannot be used in a loop
Using --list-tags          Tags within includes not listed           All tags appear with --list-tags
Using --list-tasks         Tasks within includes not listed          All tasks appear with --list-tasks
Notifying handlers         Cannot trigger handlers within includes   Can trigger individual imported handlers
Using --start-at-task      Cannot start at tasks within includes     Can start at imported tasks
Using inventory variables  Can include_*: {{ inventory_var }}        Cannot import_*: {{ inventory_var }}
With playbooks             No include_playbook                       Can import full playbooks
With variables files       Can include variables files               Use vars_files: to import variables
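
A small sketch contrasting the two (file names are placeholders): the import is resolved while the playbook is parsed, the include only when the task is reached, which is why it can be combined with a loop.

- hosts: all
  tasks:
    # Static: pre-processed during parsing, tasks inside show up in --list-tasks
    - import_tasks: static_setup.yml

    # Dynamic: evaluated at runtime, once per loop item
    - include_tasks: per_item.yml
      loop:
        - alpha
        - beta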

AAP/AWX RBAC Permissions

Source

System Role: What it can do

  • System Administrator - System wide singleton: Manages all aspects of the system
  • System Auditor - System wide singleton: Views all aspects of the system
  • Ad Hoc Role - Inventory: Runs ad hoc commands on an Inventory
  • Admin Role - Organizations, Teams, Inventory, Projects, Job Templates: Manages all aspects of a defined Organization, Team, Inventory, Project, or Job Template
  • Auditor Role - All: Views all aspects of a defined Organization, Team, Inventory, Project, or Job Template
  • Execute Role - Job Templates: Runs assigned Job Template
  • Member Role - Organization, Team: Manages all of the settings associated with that Organization or Team
  • Read Role - Organizations, Teams, Inventory, Projects, Job Templates: Views all aspects of a defined Organization, Team, Inventory, Project, or Job Template
  • Update Role - Project: Updates the Project from the configured source control management system
  • Update Role - Inventory: Updates the Inventory using the cloud source update system
  • Owner Role - Credential: Owns and manages all aspects of this Credential
  • Use Role - Credential, Inventory, Project, IGs, CGs: Uses the Credential, Inventory, Project, IGs, or CGs in a Job Template

Variable precedence

Source

This is the order in which AWX/Ansible Tower overrides variables:

  1. Workflow Job Launch extra variables: Always wins
  2. Workflow Job Template Survey: Documentation
  3. Workflow Job Template extra variables: Documentation
  4. Job artifacts: Documentation

This is the order in which variables override each other in Ansible, from highest to lowest precedence (see the sketch after the list): Source

  1. --extra-vars: Always wins precedence; defined on the command line.
  2. include params:
  3. role (and include_role) params
  4. set_facts / registered vars
  5. include_vars: Load variables from files: Documentation
  6. task vars (only for the task)
  7. block vars (only for tasks in block)
  8. role vars: <role>/vars/main.yml Documentation
  9. play vars_files
  10. play vars_prompt
  11. play vars
  12. host facts / cached set_facts
  13. playbook host_vars/*: Documentation
  14. inventory host_vars/*
  15. inventory file or script host vars
  16. playbook group_vars/*
  17. inventory group_vars/*
  18. playbook group_vars/all
  19. inventory group_vars/all: Documentation
  20. inventory file or script group vars: Documentation
  21. role defaults: <role>/defaults/main.yml, lowest possible place to define variables.
  22. Command line values (for example, -u user, these are not variables)
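
A minimal sketch to see the ordering in action (role and variable names are made up): the play var beats the role default, and --extra-vars beats both.

# roles/demo/defaults/main.yml
greeting: 'from role defaults'

# roles/demo/tasks/main.yml
- debug:
    msg: "{{ greeting }}"

# site.yml
- hosts: localhost
  gather_facts: false
  vars:
    greeting: 'from play vars'
  roles:
    - demo

# ansible-playbook site.yml                             -> prints "from play vars"
# ansible-playbook site.yml -e greeting='from extras'   -> prints "from extras"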