Today I added a WordPress blog to my site…and it was pretty good.

Today I added a WordPress blog to my site…cool story bro.

I wanted to bolt a blog onto my existing site rather than run a separate WordPress instance. The reason was basically to sharpen my skills with GCP, Ansible, PHP, Python, and MySQL.

If all you are after is getting a WordPress site up and running, it's probably much easier to go with the GCP Marketplace WordPress instance and walk away. I initially tried this and it was almost too easy: you more or less fill out a form and it's built for you, and all of the automation behind it is available and unit tested.

That being said, my work is not entirely free of shortcuts. I played with the Google Marketplace instance, a canned WordPress Ansible role, and installing WordPress manually. At this point I don't remember which one I ended up with, but it stopped mattering once I learned how to redeploy the existing site repeatably. The key here was two Google Cloud Storage buckets: one for the WordPress content files, and another for a restorable database dump.
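
Those buckets only need to exist once. Creating them by hand with gsutil looks something like this (the bucket names match what the sync script further down expects; the region is just an example):

gsutil mb -l us-east1 gs://ts-wp-files      # WordPress/static content files
gsutil mb -l us-east1 gs://ts-wp-db-dump    # restorable mysqldump output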

Here is how this site is built and deployed:

The site is deployed across two Google Cloud instances. The first runs nginx and PHP. It serves a static webpage, as well as the WordPress site. The second is a database instance.
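
The nginx.conf template itself isn't reproduced in this post, but the idea boils down to something like the server block below: static files come straight off disk, and anything ending in .php is handed to php-fpm. The document root matches where the sync script later copies content from (/var/www/html); the server name and the php-fpm socket path are illustrative.

server {
    listen 80;
    server_name www.tangosierratech.com;

    root /var/www/html;
    index index.php index.html;

    # static pages are served straight off disk;
    # anything that isn't a real file falls through to WordPress
    location / {
        try_files $uri $uri/ /index.php?$args;
    }

    # hand PHP off to php-fpm
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php-fpm/www.sock;
    }
}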

The two instances are fully orchestrated with Ansible: running the playbook builds both instances and then deploys the roles to them.

The playbook pulls the static/WordPress content, along with a database dump, from the Google Storage buckets.
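
The role tasks that do the pulling aren't shown in full, but they amount to something like the sketch below (illustrative, not the actual roles; the bucket names, paths, and database name mirror what appears elsewhere in the post):

# web_host_role (sketch): pull the WordPress/static content out of the files bucket
- name: Sync site content from ts-wp-files
  command: gsutil -m rsync -r gs://ts-wp-files/html /var/www/html

# db_host_role (sketch): fetch the dump and restore it (MySQL credentials omitted for brevity)
- name: Fetch the database dump
  command: gsutil cp gs://ts-wp-db-dump/ts_tech_db.sql /root/ts_tech_db.sql

- name: Restore the database
  shell: mysql < /root/ts_tech_db.sql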

To facilitate updates and repeatability, a Python script (shown further down) keeps the buckets up to date.

Security??…Gosh I hope so.
Our variables are encrypted using Ansible Vault. What's cool is that under the hood we point at a password file kept on disk (treated much like a private key), so there's no passphrase to type at the command line:
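
In practice that means an ansible.cfg entry pointing at the password file, plus a one-time encrypt of the variable file (the password-file path here is just an example; group_vars/all shows up in the directory tree further down):

# ansible.cfg
[defaults]
vault_password_file = ~/.ts_vault_pass

# one-time: encrypt the group variables; every ansible-playbook run afterwards
# decrypts them transparently using the same file
ansible-vault encrypt group_vars/all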

Deploying to Google Cloud using Ansible.

Here is the primary playbook:

- name: Create instance(s)
  hosts: localhost
  connection: local
  gather_facts: yes

  vars:
    service_account_email: XXXXXX-compute@developer.gserviceaccount.com
    credentials_file: /Users/tsimson/project-name.json
    project_id: project-name
    machine_type: g1-small
    image: centos-7
  tasks:

   - name: Launch Web Host
     gce:
         network: ts-tech-vpc-1
         subnetwork: ts-tech-vpc-1
         zone: us-east1-b
         instance_names: ts-web-host-1
         machine_type: "{{ machine_type }}"
         image: "{{ image }}"
         service_account_permissions: storage-full
         service_account_email: "{{ service_account_email }}"
         credentials_file: "{{ credentials_file }}"
         project_id: "{{ project_id }}"
         ip_forward: True
         tags: [ssh, http-server, https-server, subnet]
     register: gce_web_host_1
   - debug: var=gce_web_host_1

   - name: Wait for SSH to come up
     wait_for: host={{ item.public_ip }} port=22 delay=10 timeout=300
     with_items: "{{ gce_web_host_1.instance_data }}"

   - name: Add host to groupname
     add_host: hostname={{ item.name }} ansible_ssh_host={{ item.public_ip }} groupname=ts-web-hosts
     with_items: "{{ gce_web_host_1.instance_data }}"

   - name: Launch DB Host
     gce:
         network: ts-tech-vpc-1
         subnetwork: ts-tech-vpc-1
         zone: us-east1-b
         instance_names: ts-db-host-1
         machine_type: "{{ machine_type }}"
         image: "{{ image }}"
         service_account_email: "{{ service_account_email }}"
         credentials_file: "{{ credentials_file }}"
         project_id: "{{ project_id }}"
         ip_forward: True
         tags: [ssh, subnet]
     register: gce_db_host_1
   - debug: var=gce_db_host_1

   - name: Wait for SSH to come up
     wait_for: host={{ item.public_ip }} port=22 delay=10 timeout=300
     with_items: "{{ gce_db_host_1.instance_data }}"

   - name: Add host to groupname
     add_host: hostname={{ item.name }} ansible_ssh_host={{ item.public_ip }} groupname=ts-db-hosts
     with_items: "{{ gce_db_host_1.instance_data }}"

- name: Manage web-host
  hosts: ts-web-hosts
  connection: ssh
  sudo: True
  roles:
    - role: web_host_role
    - role: wordpress

- name: Manage db-host
  hosts: ts-db-hosts
  connection: ssh
  sudo: True
  roles:
    - role: db_host_role
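
Kicking the whole thing off is a single command from the project directory (the playbook file name matches the tree below; the inventory and vault password file are picked up from ansible.cfg):

ansible-playbook create_ts_tech_website.yml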

Here is the directory structure that Ansible uses:

ts_tech_web_public$ tree
.
├── README.md
├── ansible.cfg
├── create_ts_tech_website.retry
├── create_ts_tech_website.yml
├── group_vars
│   └── all
├── host_vars
├── hosts
└── roles
    ├── db_host_role
    │   ├── files
    │   │   └── my.cnf
    │   ├── tasks
    │   │   └── main.yml
    │   └── templates
    ├── web_host_role
    │   ├── files
    │   ├── tasks
    │   │   └── main.yml
    │   └── templates
    │       ├── nginx.conf
    │       └── wp_db_gb_sync.py
    └── wordpress
        ├── README.md
        ├── defaults
        │   └── main.yml
        ├── files
        ├── handlers
        │   └── main.yml
        ├── meta
        │   └── main.yml
        ├── tasks
        │   └── main.yml
        ├── templates
        │   └── wp-config.php
        ├── tests
        │   ├── inventory
        │   └── test.yml
        └── vars
            └── main.yml

20 directories, 20 files

This Python script keeps the content and database backed up off the instances and into the buckets. You run it after making live changes so the next deploy stays repeatable:

Python script…for the boys: wp_db_gb_sync.py

# wp_db_gb_sync.py -- this file lives in web_host_role/templates/, so it is
# rendered by Ansible and the {{ ... }} expressions below are filled in at
# deploy time. It dumps the WordPress database and pushes it, along with the
# site content under /var/www/html, up to the two Cloud Storage buckets.
from google.cloud import storage
import os
import glob
from subprocess import Popen
import requests

storage_client = storage.Client()
ts_wp_files_bucket = storage_client.get_bucket('ts-wp-files')
ts_wp_db_dump_bucket = storage_client.get_bucket('ts-wp-db-dump')
test_path = ''  # set to a prefix to stage uploads under a test folder instead of the bucket root

os.chdir('/root')

# Dump the database from the db host; the host, user and password are templated in by Ansible
dump_proc = Popen('mysqldump -h {{ hostvars['localhost']['gce_db_host_1']['instance_data'][0]['private_ip'] }} -u {{ mysql_user }} -p{{ mysql_pass }} --databases ts_tech_db > ts_tech_db.sql', shell=True)

# Parts of the dump can reference the instance's (ephemeral) public IP, so grab
# that IP from the metadata server and swap it for the real domain before uploading.
response = requests.request("GET", "http://metadata/computeMetadata/v1/instance/network-interfaces/0/access-configs/0/external-ip", headers={"Metadata-Flavor": "Google"})

pub_ip = response.content

# Make sure mysqldump has finished writing the file before rewriting it
dump_proc.wait()

with open("ts_tech_db.sql", 'r') as f:
    s = f.read()
    x = s.replace(pub_ip, "www.tangosierratech.com")

with open("ts_tech_db.sql", 'w') as f:
    f.write(x)

def copy_local_file_to_gcs(bucket, local_file):
    """Upload a single local file to the bucket under the same name."""
    blob = bucket.blob(local_file)
    blob.upload_from_filename(local_file)

# Push the cleaned-up dump into the database bucket
copy_local_file_to_gcs(ts_wp_db_dump_bucket, 'ts_tech_db.sql')

# Now push the whole content tree (html/) up to the files bucket
os.chdir('/var/www/')

def copy_local_directory_to_gcs(bucket, local_path):
    """Recursively copy a directory of files to GCS.

    local_path should be a directory and not have a trailing slash.
    """
    assert os.path.isdir(local_path)
    def walk(local_path):
        for path in glob.glob(local_path + '/**'):
            if os.path.isdir(path):
                walk(path)
            else:
                # optional test_path prefix stages uploads under a sub-folder
                remote_path = os.path.join(test_path, path) if test_path else path
                print remote_path
                blob = bucket.blob(remote_path)
                blob.upload_from_filename(path)

    walk(local_path)

copy_local_directory_to_gcs(ts_wp_files_bucket, 'html')
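
After making live changes (a new post, an updated theme, whatever), the sync is just a matter of running the script as root on the web host. Exactly where the role drops the script is up to you; the path here is only an example:

sudo python /root/wp_db_gb_sync.py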
